- 311 Views
- 3 replies
- 1 kudos
Best practices for 3-layer access control in Databricks
We are designing an identity and access management model for Databricks and want to implement a clear 3-layer authorization approach. Account level: account RBAC roles (account admin, metastore admin, etc.). Workspace level: workspace roles/entitlements + workspace ACLs (c...
Here is a high-level RACI chart.

| Capability | Platform Admins | Data Stewards (Domain) | Data Engineers (Domain) | Analysts/BI | Security/Compliance |
|---|---|---|---|---|---|
| Account setup / workspaces | R/A | C | I | I | C |
| Metastore / locations / creds | R/A | C | I | I | C |
| Catalog/Schema design (per domain) | I | R/A | C | I | C |
| Grants (... | | | | | |
- 2620 Views
- 13 replies
- 13 kudos
Resolved! I need a switch to turn off Data Apps in databricks workspaces
Hi, how do I disable Data Apps on my workspace? This is really annoying that Databricks pushes new features without any option to disable them. At least you should have some tools to control access before rolling it out. It seems you only care about fe...
Thanks @Louis_Frolio. Additionally, I have one question: you said that account-level admins have the authority to create or manage Apps. Where is that option in the Account console?
- 221 Views
- 2 replies
- 2 kudos
Resolved! The Lakeflow connect Gateway setup, do we need to install the agent on-prem?
The Lakeflow Connect gateway setup to connect to an on-prem SQL Server. Please provide the steps to set up the gateway agent on-prem. Where do I download this agent? What do the outbound firewall rules look like? Kind regards, Asha
Hi @Ashash12 , you need proper network connectivity to your on-premises SQL Server. As stated in the docs, the connector supports SQL Server on-premises using Azure ExpressRoute and AWS Direct Connect networking: https://docs.databricks.com/aws...
- 410 Views
- 2 replies
- 0 kudos
Queries Hanging Indefinitely
I spun up a Databricks environment on AWS via the AWS Marketplace. All the required infrastructure such as S3, VPC, and subnets is automatically created during the process. Once I got the Databricks environment up and running, I created a cluster. I attac...
Hi, I believe this is happening because you don't have the right ports open to connect between your classic compute and the UC metastore. `SELECT 1` works as it doesn't need to talk to the metastore, but when you do `SHOW CATALOGS` it is t...
- 407 Views
- 3 replies
- 0 kudos
Resolved! PERMISSION_DENIED: Request for user delegation key is not authorized.
I am attempting to copy files from an Azure Storage container using an Azure Databricks Volume. When attempting to list files using dbutils.fs.ls('/Volumes/myCatalog/mySchema/myVolume') I get the following error: ExecutionError: (com.databricks.sql.man...
@szymon_dybczak Can you provide a link to the documentation you noted? I confirmed this with my own testing, that the Storage Blob Delegator role must be at the ADLS account-level, and Storage Blob Data Reader can then be applied at the container-lev...
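To illustrate the scoping split described above, the two roles can be assigned at different scopes with the Azure CLI. The subscription, resource group, account, and container names below are placeholders, a sketch rather than a verified recipe:

```shell
# Storage Blob Delegator at the storage-account level (needed to
# request user delegation keys):
az role assignment create \
  --assignee "<principal-object-id>" \
  --role "Storage Blob Delegator" \
  --scope "/subscriptions/<sub>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<account>"

# Storage Blob Data Reader can then be scoped down to one container:
az role assignment create \
  --assignee "<principal-object-id>" \
  --role "Storage Blob Data Reader" \
  --scope "/subscriptions/<sub>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<account>/blobServices/default/containers/<container>"
```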
- 180 Views
- 2 replies
- 0 kudos
Need the cost explorer
We would like each workspace admin to be able to easily check the cost of their own workspace for free. Currently, we have a Usage dashboard, but it shows account-level costs and requires a SQL Warehouse to view. This means that additional costs are i...
Databricks currently does not provide a free, workspace-level cost viewer; all usage/cost dashboards are account-level and require a SQL Warehouse, which does create cost. So your request is valid and aligns with what many customers want. Here are a few o...
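Until a workspace-level viewer exists, one common workaround is a query over the `system.billing.usage` system table, filtered to a single workspace. The query below is a sketch; the workspace ID is a placeholder, and you should verify the column names against your own system-tables schema:

```sql
-- Approximate DBU consumption for one workspace over the last 30 days.
SELECT
  usage_date,
  sku_name,
  SUM(usage_quantity) AS dbus
FROM system.billing.usage
WHERE workspace_id = '1234567890123456'   -- placeholder workspace ID
  AND usage_date >= date_sub(current_date(), 30)
GROUP BY usage_date, sku_name
ORDER BY usage_date;
```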
- 164 Views
- 1 replies
- 2 kudos
How to connect to AWS Custom VPC endpoint
Hello, could somebody help me with a connection issue to my VPC endpoint? I have created a customer-managed VPC in AWS and set up a new workspace with that VPC. There is an RDS instance in another VPC that I want to connect to from Databricks, and I have created a VP...
“Connection refused” means the TCP handshake reached your endpoint ENI and the backend actively rejected the connection. That’s different from a timeout (routing/DNS), so your PrivateLink plumbing and DNS are mostly correct. Short fixes you can try:...
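The refused-vs-timeout distinction is easy to probe from a notebook. A minimal sketch, assuming only outbound TCP from the cluster (the RDS hostname in the usage note is hypothetical):

```python
import socket

def classify_tcp(host: str, port: int, timeout: float = 3.0) -> str:
    """Classify a TCP connection attempt as 'open', 'refused', or 'timeout'.

    'refused' means the handshake reached a live endpoint that sent back
    an RST (routing and DNS are working); 'timeout' usually points at
    routing, DNS, or a silently dropping security group / firewall.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return "open"
    except ConnectionRefusedError:
        return "refused"
    except (socket.timeout, TimeoutError):
        return "timeout"
```

From a Databricks notebook you would call it against the RDS endpoint and port, e.g. `classify_tcp("my-rds.internal.example.com", 5432)` for PostgreSQL.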
- 134 Views
- 1 replies
- 0 kudos
force_destroy/force_update option for workspace APIs
The Unity Catalog delete API supports the force option: https://docs.databricks.com/api/workspace/catalogs/delete Is it possible to support the force option for the workspace APIs? https://docs.databricks.com/api/account/workspaces/delete Context: At the ...
Hi @littlewat , currently this is not supported by the API. You can raise a feature request, though.
- 135 Views
- 0 replies
- 0 kudos
Databricks on AWS Marketplace – Unity Catalog & S3 Access Failing with SSL “Connection reset”
Hi all, I’m facing an issue accessing AWS S3 and Unity Catalog from a Databricks AWS Marketplace workspace. Problem: whenever Databricks tries to access S3 or Unity Catalog, it fails with: javax.net.ssl.SSLException: Connection reset. What works: Spark job...
- 4492 Views
- 5 replies
- 3 kudos
Drop table - permission management
Hello, I'm trying to wrap my head around permission management for dropping tables in UC-enabled schemas. According to the docs: to drop a table you must have the MANAGE privilege on the table, be its owner, or the owner of the schema, catalog, or meta...
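The rule quoted from the docs can be exercised with a Unity Catalog grant; the three-level table name and group name below are placeholders:

```sql
-- Let an engineering group drop (and otherwise manage) one table
-- without owning it or its parent schema:
GRANT MANAGE ON TABLE main.sales.orders TO `data-engineers`;
```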
Hey, any news about this functionality being implemented? Br
- 212 Views
- 1 replies
- 0 kudos
Regarding - Managed vs External volumes and tables
From a creation perspective, the steps for managed and external volumes appear almost identical: both require a storage credential, both require an external location, and both point to customer-owned S3. So what exactly makes a volume “managed” vs “external”? Wh...
Managed and external volumes may look the same because both store data in the customer’s S3 and use customer IAM roles. However, the real difference is who controls the data folder. With a managed volume, Databricks creates the folder in S3 and contro...
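The difference shows up directly in the DDL; the catalog, schema, and bucket names below are placeholders:

```sql
-- Managed volume: no LOCATION clause; Databricks picks and owns the
-- folder under the schema's (or catalog's/metastore's) storage root.
CREATE VOLUME main.raw.landing_managed;

-- External volume: you point at an existing path covered by an
-- external location, and you control that folder's lifecycle.
CREATE EXTERNAL VOLUME main.raw.landing_external
  LOCATION 's3://my-company-bucket/landing/';
```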
- 408 Views
- 2 replies
- 3 kudos
Resolved! Unity Catalog design in single workspace: dev/prod catalogs and schemas for projects — should we add
Hello everyone, we are currently designing our Unity Catalog structure and would like feedback on whether our approach makes sense and how it could be improved. Context: we use a single Databricks workspace shared by Data Engineering and Data Science/ML...
Hey @JoaoPigozzo — great question. This one comes up all the time with the customers I train. I’ve been doing this for quite a while now and have had the chance to see a wide range of implementations and approaches out in the wild. While there’s no ...
- 138 Views
- 1 replies
- 0 kudos
Need help with changing RunAs owner
Dear contributors, I have a scenario where I am required to trigger a couple of jobs (let's say 4) based on the success of a master job. The master job has a Service Principal attached to it, which is the owner. It is going to produce a list of items w...
@ckunal_eng - A single Databricks job run cannot dynamically change its "Run As" identity during execution. Instead you will need a pattern that separates the triggering identity from the executing identity. I would pre-configure 4 dependent jobs wit...
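A sketch of the triggering side, assuming the four downstream jobs already exist with their own "Run as" identities configured. The endpoint is the standard Jobs 2.1 `run-now` API; the host, token environment variables, and job IDs are placeholders:

```python
import json
import os
import urllib.request

def build_run_now_payload(job_id: int, params: dict) -> dict:
    """Request body for POST /api/2.1/jobs/run-now."""
    return {"job_id": job_id, "job_parameters": params}

def trigger_job(job_id: int, params: dict) -> dict:
    """Trigger a pre-configured job; it runs as ITS configured 'Run as'
    identity, not as the caller, separating triggering from executing."""
    host = os.environ["DATABRICKS_HOST"]    # e.g. https://adb-123.azuredatabricks.net
    token = os.environ["DATABRICKS_TOKEN"]  # PAT or SP OAuth token
    req = urllib.request.Request(
        f"{host}/api/2.1/jobs/run-now",
        data=json.dumps(build_run_now_payload(job_id, params)).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)   # contains the run_id of the new run
```

The master job's final task could then loop over the four (hypothetical) job IDs: `for jid in [101, 102, 103, 104]: trigger_job(jid, {"item": item})`.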
- 127 Views
- 1 replies
- 0 kudos
Azure DevOps Release (CD) pipeline - Databricks tasks no longer available
Hello and happy new year everyone. We've noticed that our Azure DevOps Release (CD) pipelines have had all of their Databricks tasks uninstalled, and we cannot find them in the marketplace anymore. The author for both is Microsoft DevLabs. We mainly rel...
Hi @BigAlThePal , they may well have removed it. This extension was marked as deprecated a long time ago: Alternative to this extension (due to deprecation?) · Issue #50 · microsoft/azdo-databricks; DevOps for Azure Databricks - Visual Studio Marketplace. Last commit ...
- 559 Views
- 1 replies
- 0 kudos
GitHub Actions OIDC with Databricks: wildcard subject for pull_request workflows
Hi, I’m configuring GitHub Actions OIDC authentication with Databricks following the official documentation: https://docs.databricks.com/aws/en/dev-tools/auth/provider-github When running a GitHub Actions workflow triggered by pull_request, authenticati...
There isn’t a Databricks-side wildcard subject pattern to solve this. Databricks service principal federation policies don’t support wildcard / glob / regex matching on subject today; the policy’s oidc_policy.subject is effectively an exact match ag...
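For pull_request-triggered workflows, GitHub issues tokens with the fixed subject `repo:<org>/<repo>:pull_request`, so an exact-match policy can still cover them. A sketch of such a policy, with the org and repo names as placeholders (field names per the federation-policy docs linked above; verify before use):

```json
{
  "oidc_policy": {
    "issuer": "https://token.actions.githubusercontent.com",
    "audiences": ["https://github.com/my-org"],
    "subject": "repo:my-org/my-repo:pull_request"
  }
}
```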
Labels: Access control (1), Apache spark (1), Azure (7), Azure databricks (5), Billing (2), Cluster (1), Compliance (1), Data Ingestion & connectivity (5), Databricks Runtime (1), Databricks SQL (2), DBFS (1), Dbt (1), Delta Sharing (1), DLT Pipeline (1), GA (1), Gdpr (1), Github (1), Partner (64), Public Preview (1), Service Principals (1), Unity Catalog (1), Workspace (2)