- 298 Views
- 4 replies
- 2 kudos
Delta Sharing from Databricks to SAP BDC fails with invalid_client error
Context: We are in the process of extracting data between SAP BDC Datasphere and Databricks (Brownfield Implementation). SAP Datasphere is hosted in AWS (eu10). Databricks is hosted in Azure (West Europe). The BDC Connect System is located in the same regio...
The error DELTA_SHARING_INVALID_RECIPIENT_AUTH refers to an invalid authorization specification when accessing Delta Sharing resources. This maps to SQLSTATE code 28000 ("invalid authorization specification") and typically occurs when the recipient's...
- 45 Views
- 1 reply
- 0 kudos
PERMISSION_DENIED: Request for user delegation key is not authorized.
I am attempting to copy files from an Azure Storage container using an Azure Databricks Volume. When attempting to list files using dbutils.fs.ls('/Volumes/myCatalog/mySchema/myVolume') I get the following error: ExecutionError: (com.databricks.sql.man...
Hi @hietpas, I think your access connector doesn't have sufficient permissions on the storage account. Check the documentation entry below. Try granting the Storage Blob Data Contributor role to your connector.
- 4227 Views
- 5 replies
- 3 kudos
Drop table - permission management
Hello, I'm trying to wrap my head around the permission management for dropping tables in UC-enabled schemas. According to the docs: To drop a table you must have the MANAGE privilege on the table, be its owner, or the owner of the schema, catalog, or meta...
Hey, any news about this functionality being implemented? Br
- 47 Views
- 1 reply
- 0 kudos
Best practices for 3-layer access control in Databricks
We are designing an identity and access management model for Databricks and want to implement a clear 3-layer authorization approach: Account level: account RBAC roles (account admin, metastore admin, etc.). Workspace level: workspace roles/entitlements + workspace ACLs (c...
Great question. While there are some general best practices, a lot of it comes down to how your organization already handles governance, both for "deployment" and for "data governance". For example, Org1 might not already have a pro...
- 77 Views
- 1 reply
- 0 kudos
Regarding - Managed vs External volumes and tables
From a creation perspective, the steps for managed and external volumes appear almost identical: both require a storage credential, both require an external location, and both point to customer-owned S3. So what exactly makes a volume “managed” vs “external”? Wh...
Managed and external volumes may look the same because both store data in the customer’s S3 and use customer IAM roles. However, the real difference is who controls the data folder. With a managed volume, Databricks creates the folder in S3 and contro...
- 121 Views
- 2 replies
- 2 kudos
Unity Catalog design in single workspace: dev/prod catalogs and schemas for projects — should we add
Hello everyone, We are currently designing our Unity Catalog structure and would like feedback on whether our approach makes sense and how it could be improved. Context: We use a single Databricks workspace shared by Data Engineering and Data Science/ML...
Hey @JoaoPigozzo — great question. This one comes up all the time with the customers I train. I’ve been doing this for quite a while now and have had the chance to see a wide range of implementations and approaches out in the wild. While there’s no ...
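One common convention seen in setups like this is an environment-qualified catalog per project with a schema per medallion layer. As a minimal sketch (the environment names, layer names, and the helper itself are illustrative assumptions, not a Databricks standard), the three-level names could be composed like this:

```python
def uc_table_name(env: str, project: str, layer: str, table: str) -> str:
    """Build a three-level Unity Catalog name using one common convention:
    an environment-qualified catalog per project, a schema per layer.
    The env and layer vocabularies here are assumptions, not a standard."""
    valid_envs = {"dev", "prod"}
    valid_layers = {"bronze", "silver", "gold"}
    if env not in valid_envs or layer not in valid_layers:
        raise ValueError(f"unexpected env={env!r} or layer={layer!r}")
    return f"{env}_{project}.{layer}.{table}"

print(uc_table_name("dev", "churn_model", "silver", "features"))
# dev_churn_model.silver.features
```

Validating names centrally like this keeps dev and prod objects clearly separated even when both live in the same workspace.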
- 53 Views
- 1 reply
- 0 kudos
Need help with changing RunAs owner
Dear Contributors, I have a scenario where I am required to trigger a couple of jobs (let's say 4) based on the success of a master job. The master job has a Service Principal attached to it, which is the owner. It is going to produce a list of items w...
@ckunal_eng - A single Databricks Job run cannot dynamically change its "Run As" identity during execution. Instead, you will need a pattern that separates the triggering identity from the executing identity. I would pre-configure 4 dependent jobs wit...
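The fan-out step of this pattern can be sketched as follows: the master job finishes, then a small orchestration step triggers the pre-configured downstream jobs, each of which carries its own "Run As" service principal in its job settings. The `run_now` call mirrors the Databricks SDK's Jobs API, but the job IDs and the `item` parameter name are hypothetical, and a stub client stands in for a real workspace connection:

```python
def trigger_downstream(jobs_client, job_ids, items):
    """Trigger one pre-configured job per work item; returns the run handles.
    Each downstream job's "Run As" identity comes from its own job settings,
    not from this triggering code."""
    runs = []
    for job_id, item in zip(job_ids, items):
        runs.append(jobs_client.run_now(job_id=job_id, job_parameters={"item": item}))
    return runs

class _StubJobsClient:
    """Stand-in for a real Jobs API client so the sketch runs anywhere."""
    def __init__(self):
        self.calls = []
    def run_now(self, job_id, job_parameters):
        self.calls.append((job_id, job_parameters))
        return {"job_id": job_id, "params": job_parameters}

stub = _StubJobsClient()
runs = trigger_downstream(stub, [101, 102, 103, 104], ["a", "b", "c", "d"])
print(len(runs))  # 4
```

In a real workspace you would pass `WorkspaceClient().jobs` (from the Databricks SDK) in place of the stub; the triggering identity only needs CAN_MANAGE_RUN on the downstream jobs.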
- 68 Views
- 1 reply
- 0 kudos
Azure DevOps Release (CD) pipeline - Databricks tasks no longer available
Hello and happy new year everyone. We've noticed that our Azure DevOps Release (CD) pipelines have had all of their Databricks tasks uninstalled, and we cannot find them in the marketplace anymore. The author for both is Microsoft DevLabs. We mainly rel...
Hi @BigAlThePal, they could have removed it. This extension was marked as deprecated a long time ago: Alternative to this extension (due to deprecation?) · Issue #50 · microsoft/azdo-databricks; DevOps for Azure Databricks - Visual Studio Marketplace. Last commit ...
- 146 Views
- 1 reply
- 0 kudos
GitHub Actions OIDC with Databricks: wildcard subject for pull_request workflows
Hi, I’m configuring GitHub Actions OIDC authentication with Databricks following the official documentation: https://docs.databricks.com/aws/en/dev-tools/auth/provider-github. When running a GitHub Actions workflow triggered by pull_request, authenticati...
There isn’t a Databricks-side wildcard subject pattern to solve this. Databricks service principal federation policies don’t support wildcard/glob/regex matching on the subject today; the policy’s oidc_policy.subject is effectively an exact match ag...
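The exact-match behaviour is less painful than it first appears for pull_request workflows, because GitHub's default OIDC subject claim for PR-triggered runs does not include the branch: every PR in a repo presents the same subject, so one federation policy can cover all of them. A small sketch of the subject formats GitHub issues for common triggers (the org/repo names are placeholders):

```python
def github_oidc_subjects(org: str, repo: str, branch: str = "main") -> dict:
    """Default GitHub OIDC subject claim formats for common workflow triggers.
    Assumes the repo has not customized its OIDC subject claim."""
    full = f"{org}/{repo}"
    return {
        # push to a specific branch carries the ref in the subject
        "push_to_branch": f"repo:{full}:ref:refs/heads/{branch}",
        # all pull_request runs in the repo share this single subject
        "pull_request": f"repo:{full}:pull_request",
    }

print(github_oidc_subjects("my-org", "my-repo")["pull_request"])
# repo:my-org/my-repo:pull_request
```

So a single policy whose subject is `repo:my-org/my-repo:pull_request` matches every PR-triggered run in that repo; separate policies are only needed per repo and per trigger type, not per branch.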
- 99 Views
- 3 replies
- 3 kudos
Resolved! How to disable storage account key access of workspace storage accounts?
Each Azure Databricks workspace has an associated Azure storage account in a managed resource group known as the workspace storage account. The workspace storage account includes workspace system data (job output, system settings, and logs), DBFS roo...
Thank you @Raman_Unifeye and @nayan_wylde, it helped.
- 1118 Views
- 3 replies
- 0 kudos
Databricks Apps not working in postman
I have a question regarding Databricks Apps. I have deployed my Databricks App, and it's working on my laptop, but when I try to open the same URL on my mobile it redirects to the Databricks sign-in page, and it's also not working through Postman as wel...
Is this issue specifically with Databricks Apps, am I right? Are you getting an error message?
- 350 Views
- 1 reply
- 1 kudos
Databricks AWS deployment with custom configurations (workspace root storage)
Hi everyone, I have a question about the IAM role for workspace root storage when deploying Databricks on AWS with custom configurations (customer-managed VPC, storage configurations, credential configurations, etc.). At an earlier stage of our deploym...
Great question. [1] In the pre-UC world, when you created a workspace you would designate a bucket/container that was used for what was most commonly known as DBFS, i.e., it's where the hive-metastore managed tables would be stored by default along with ...
- 284 Views
- 3 replies
- 2 kudos
system.lakeflow.job_task_run_timeline table missing task parameters for For each loop input
The system.lakeflow.job_task_run_timeline table does not include the task-level parameters on the input of the For each loop if dynamically setting the parameter in another notebook using dbutils.jobs.taskValues.set. This information is not included in the...
Hi @hgintexas , You’re right, the system job timeline tables and the Runs API don’t currently surface the resolved per‑iteration inputs for a For-each task when those inputs are sourced via task values set in another notebook with dbutils.jobs.taskVa...
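One workaround while the system tables don't capture these values is to have each For-each iteration emit its own resolved input as a structured audit record (for example, appended to a small audit table or printed to the driver log, where it can be picked up later). The record shape below is an assumption for illustration, not a Databricks schema:

```python
import datetime
import json

def iteration_audit_record(job_id: int, run_id: int, iteration_input) -> str:
    """Serialize one For-each iteration's resolved input as a JSON audit record.
    The field names here are a suggested convention, not a system-table schema."""
    return json.dumps({
        "job_id": job_id,
        "run_id": run_id,
        "input": iteration_input,
        "logged_at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    })

# hypothetical IDs; in a real task these would come from the job context
rec = iteration_audit_record(123, 456, {"region": "emea"})
```

Joining such records back to system.lakeflow.job_task_run_timeline on job_id/run_id gives you the per-iteration inputs the timeline table currently omits.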
- 374 Views
- 1 reply
- 0 kudos
Databricks import directory false positive import
Hello everyone, I'm using the Databricks CLI to move several directories from Azure Repos to the Databricks Workspace. The problem is that files are not updating properly, with no error displayed. The self-hosted agent in the pipeline I'm using has installed the ...
Hi @Giuseppe_C, can you try running the command with the --debug flag to see if there is any additional information you can gather? Full command: databricks import-dir <source> <dest> --overwrite --debug. Also, verify that the target path is not ...
- 1872 Views
- 3 replies
- 0 kudos
Error: PERMISSION_DENIED: AWS IAM role does
Hello, we are trying to set up a new workspace. However, we are getting the following error: Workspace failed to launch. Error: PERMISSION_DENIED: AWS IAM role does not have READ permissions on url s3://jk-databricks-prods3/unity-catalog/742920957025975. Pl...
I am assuming you are creating this manually and not using Terraform. Is s3://jk-databricks-prods3/unity-catalog/742920957025975 the external storage location for your UC Metastore? Can you check if the storage credential used by this external storag...
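For reference, a minimal sketch of the read permissions the error complains about might look like the policy fragment below, using the bucket name from the error message. This is an assumption for illustration only; the full policy Databricks requires for Unity Catalog storage also includes write/delete actions (e.g. s3:PutObject, s3:DeleteObject) and, where applicable, an sts:AssumeRole self-assumption statement, so check the official storage-credential documentation for the complete template:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "UnityCatalogRead",
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:ListBucket", "s3:GetBucketLocation"],
      "Resource": [
        "arn:aws:s3:::jk-databricks-prods3",
        "arn:aws:s3:::jk-databricks-prods3/*"
      ]
    }
  ]
}
```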