Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

4Twannie
by New Contributor II
  • 298 Views
  • 4 replies
  • 2 kudos

Delta Sharing from Databricks to SAP BDC fails with invalid_client error

Context: We are in the process of extracting data between SAP BDC Datasphere and Databricks (Brownfield Implementation). SAP Datasphere is hosted in AWS (eu10); Databricks is hosted in Azure (West Europe). The BDC Connect System is located in the same regio...

  • 298 Views
  • 4 replies
  • 2 kudos
Latest Reply
anshu_roy
Databricks Employee
  • 2 kudos

The error DELTA_SHARING_INVALID_RECIPIENT_AUTH refers to an invalid authorization specification when accessing Delta Sharing resources. This maps to SQLSTATE code 28000 ("invalid authorization specification") and typically occurs when the recipient's...

  • 2 kudos
3 More Replies
hietpas
by New Contributor
  • 45 Views
  • 1 reply
  • 0 kudos

PERMISSION_DENIED: Request for user delegation key is not authorized.

I am attempting to copy files from an Azure Storage container using an Azure Databricks Volume. When attempting to list files using dbutils.fs.ls('/Volumes/myCatalog/mySchema/myVolume') I get the following error: ExecutionError: (com.databricks.sql.man...

  • 45 Views
  • 1 reply
  • 0 kudos
Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 0 kudos

Hi @hietpas, I think your access connector doesn't have sufficient permissions on the storage account. Check the documentation entry below. Try granting the Storage Blob Data Contributor role to your connector.

  • 0 kudos
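The role grant the reply suggests can be sketched as the Azure RBAC role-assignment request it ultimately boils down to. This is a minimal illustration, not the Databricks-documented procedure: the subscription and principal IDs are placeholders, and the role-definition GUID is the well-known built-in ID for Storage Blob Data Contributor (verify it against Azure's built-in roles list before relying on it).

```python
import json

# Built-in role-definition GUID for "Storage Blob Data Contributor"
# (assumed; check the Azure built-in roles reference).
BLOB_DATA_CONTRIBUTOR = "ba92f5b4-2d11-453d-a403-e96b0029c9fe"

def role_assignment_body(subscription_id: str, connector_principal_id: str) -> dict:
    """Build the PUT body for Azure's roleAssignments REST API."""
    role_definition_id = (
        f"/subscriptions/{subscription_id}"
        f"/providers/Microsoft.Authorization/roleDefinitions/{BLOB_DATA_CONTRIBUTOR}"
    )
    return {
        "properties": {
            "roleDefinitionId": role_definition_id,
            # The access connector's managed identity (object id) -- placeholder.
            "principalId": connector_principal_id,
            "principalType": "ServicePrincipal",
        }
    }

body = role_assignment_body(
    "00000000-0000-0000-0000-000000000000",  # placeholder subscription id
    "11111111-1111-1111-1111-111111111111",  # placeholder principal id
)
print(json.dumps(body, indent=2))
```

In practice the same grant is usually done through the Azure portal or CLI; the point is that the assignment targets the connector's managed identity at the storage-account scope.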
PiotrM
by New Contributor III
  • 4227 Views
  • 5 replies
  • 3 kudos

Drop table - permission management

Hello, I'm trying to wrap my head around the permission management for dropping tables in UC-enabled schemas. According to the docs: To drop a table you must have the MANAGE privilege on the table, be its owner, or the owner of the schema, catalog, or meta...

  • 4227 Views
  • 5 replies
  • 3 kudos
Latest Reply
PiotrM
New Contributor III
  • 3 kudos

Hey, any news about this functionality being implemented? Br

  • 3 kudos
4 More Replies
APJESK
by New Contributor III
  • 47 Views
  • 1 reply
  • 0 kudos

Best practices for 3-layer access control in Databricks

Identity and access management model for Databricks; we want to implement a clear 3-layer authorization approach: Account level: account RBAC roles (account admin, metastore admin, etc.). Workspace level: workspace roles/entitlements + workspace ACLs (c...

  • 47 Views
  • 1 reply
  • 0 kudos
Latest Reply
MoJaMa
Databricks Employee
  • 0 kudos

Great question. While there are some general best practices, a lot of it comes down to how your organization already does some type of governance, when it comes to "deployment" and also "data governance". For example Org1 might not already have a pro...

  • 0 kudos
APJESK
by New Contributor III
  • 77 Views
  • 1 reply
  • 0 kudos

Regarding - Managed vs External volumes and tables

From a creation perspective, the steps for managed and external volumes appear almost identical: both require a storage credential, both require an external location, and both point to customer-owned S3. So what exactly makes a volume “managed” vs “external”? Wh...

  • 77 Views
  • 1 reply
  • 0 kudos
Latest Reply
themahesh
New Contributor
  • 0 kudos

Managed and external volumes may look the same because both store data in the customer’s S3 and use customer IAM roles. However, the real difference is who controls the data folder. With a managed volume, Databricks creates the folder in S3 and contro...

  • 0 kudos
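The syntactic side of the distinction the reply describes can be sketched as follows. The catalog, schema, volume names, and bucket path are invented examples; the general shape is that a managed volume inherits its location from the schema/catalog's managed storage, while an external volume pins an explicit LOCATION whose files survive a DROP.

```python
# Example DDL strings (illustrative names and paths, not a real workspace).
managed_sql = "CREATE VOLUME main.raw.landing"
external_sql = (
    "CREATE EXTERNAL VOLUME main.raw.landing_ext "
    "LOCATION 's3://my-bucket/landing'"
)

def is_external(stmt: str) -> bool:
    # The EXTERNAL keyword plus an explicit LOCATION is what keeps path
    # ownership with you; dropping an external volume leaves the files in
    # place, whereas a managed volume's folder is controlled by Databricks.
    return "EXTERNAL" in stmt and "LOCATION" in stmt

print(is_external(managed_sql))   # False
print(is_external(external_sql))  # True
```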
JoaoPigozzo
by New Contributor III
  • 121 Views
  • 2 replies
  • 2 kudos

Unity Catalog design in single workspace: dev/prod catalogs and schemas for projects — should we add

Hello everyone, we are currently designing our Unity Catalog structure and would like feedback on whether our approach makes sense and how it could be improved. Context: we use a single Databricks workspace shared by Data Engineering and Data Science/ML...

  • 121 Views
  • 2 replies
  • 2 kudos
Latest Reply
Louis_Frolio
Databricks Employee
  • 2 kudos

Hey @JoaoPigozzo  — great question. This one comes up all the time with the customers I train. I’ve been doing this for quite a while now and have had the chance to see a wide range of implementations and approaches out in the wild. While there’s no ...

  • 2 kudos
1 More Replies
ckunal_eng
by New Contributor
  • 53 Views
  • 1 reply
  • 0 kudos

Need help with changing RunAs owner

Dear Contributors, I have a scenario where I am required to trigger a couple of jobs (let's say 4) based on the success of a master job. The master job has a Service Principal attached to it, which is the owner. It is going to produce a list of items w...

  • 53 Views
  • 1 reply
  • 0 kudos
Latest Reply
Raman_Unifeye
Contributor III
  • 0 kudos

@ckunal_eng - A single Databricks job run cannot dynamically change its "Run As" identity during execution. Rather, you will need a pattern that separates the triggering identity from the executing identity. I would pre-configure 4 dependent jobs wit...

  • 0 kudos
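The pattern the reply outlines can be sketched as the final step of the master job building one Jobs API `run-now` call per pre-configured dependent job. Each dependent job carries its own "Run As" identity in its job settings; the master only starts runs. The job IDs, names, and parameters below are hypothetical, and the payload shape follows the Jobs API 2.1 `run-now` request.

```python
# Hypothetical dependent jobs, each pre-configured with its own Run As
# service principal in the Databricks UI / asset bundle.
DEPENDENT_JOBS = {
    "ingest": 101,
    "transform": 102,
    "publish": 103,
    "notify": 104,
}

def run_now_payloads(items_from_master: list[str]) -> list[dict]:
    """Build one POST /api/2.1/jobs/run-now body per dependent job,
    passing the master job's output list as a job parameter."""
    return [
        {
            "job_id": job_id,
            "job_parameters": {"items": ",".join(items_from_master)},
        }
        for job_id in DEPENDENT_JOBS.values()
    ]

payloads = run_now_payloads(["order_a", "order_b"])
print(len(payloads))  # 4
```

The triggering identity only needs CAN_MANAGE_RUN on the four jobs; the executing identity of each run stays whatever "Run As" those jobs were configured with.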
BigAlThePal
by New Contributor III
  • 68 Views
  • 1 reply
  • 0 kudos

Azure DevOps Release (CD) pipeline - Databricks tasks no longer available

Hello and happy new year everyone. We've noticed that our Azure DevOps Release (CD) pipelines have had all of their Databricks tasks uninstalled, and we cannot find them in the marketplace anymore. The author for both is Microsoft DevLabs. We mainly rel...

  • 68 Views
  • 1 reply
  • 0 kudos
Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 0 kudos

Hi @BigAlThePal, it looks like they removed it. This extension was marked as deprecated a long time ago: Alternative to this extension (due to deprecation?) · Issue #50 · microsoft/azdo-databricks; DevOps for Azure Databricks - Visual Studio Marketplace. Last commit ...

  • 0 kudos
Valerio
by New Contributor
  • 146 Views
  • 1 reply
  • 0 kudos

GitHub Actions OIDC with Databricks: wildcard subject for pull_request workflows

Hi, I’m configuring GitHub Actions OIDC authentication with Databricks following the official documentation: https://docs.databricks.com/aws/en/dev-tools/auth/provider-github. When running a GitHub Actions workflow triggered by pull_request, authenticati...

  • 146 Views
  • 1 reply
  • 0 kudos
Latest Reply
bianca_unifeye
Contributor
  • 0 kudos

There isn’t a Databricks-side wildcard subject pattern to solve this. Databricks service principal federation policies don’t support wildcard/glob/regex matching on the subject today; the policy’s oidc_policy.subject is effectively an exact match ag...

  • 0 kudos
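The exact-match behavior the reply describes can be illustrated with a small sketch. The org/repo names are made up, but one useful property (to the best of my knowledge of GitHub's OIDC token documentation) is that a pull_request-triggered workflow gets a fixed-shape subject, `repo:<org>/<repo>:pull_request`, so an exact-match policy can still cover all PRs from one repo without wildcards.

```python
def policy_matches(policy_subject: str, token_subject: str) -> bool:
    # No glob/regex support: the federation policy's subject must equal
    # the OIDC token's `sub` claim exactly.
    return policy_subject == token_subject

# Hypothetical policy for all pull_request runs of one repo.
policy = "repo:my-org/my-repo:pull_request"

print(policy_matches(policy, "repo:my-org/my-repo:pull_request"))         # True
print(policy_matches(policy, "repo:my-org/my-repo:ref:refs/heads/main"))  # False
```

Branch-triggered (`push`) workflows embed the ref in the subject, which is where the lack of wildcards actually bites: each ref needs its own policy.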
Charuvil
by New Contributor III
  • 99 Views
  • 3 replies
  • 3 kudos

Resolved! How to disable storage account key access of workspace storage accounts?

Each Azure Databricks workspace has an associated Azure storage account in a managed resource group known as the workspace storage account. The workspace storage account includes workspace system data (job output, system settings, and logs), DBFS roo...

  • 99 Views
  • 3 replies
  • 3 kudos
Latest Reply
Charuvil
New Contributor III
  • 3 kudos

Thank you @Raman_Unifeye and @nayan_wylde, it helped.

  • 3 kudos
2 More Replies
arun_prakash
by New Contributor
  • 1118 Views
  • 3 replies
  • 0 kudos

Databricks Apps not working in postman

I have a question regarding Databricks Apps. I have deployed my Databricks App, and it's working on my laptop, but when I try to open the same URL on my mobile it redirects to the Databricks sign-in page, and it's also not working through Postman as wel...

  • 1118 Views
  • 3 replies
  • 0 kudos
Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

Is this issue specifically with Databricks Apps, am I right? Are you getting an error message?

  • 0 kudos
2 More Replies
margarita_shir
by New Contributor II
  • 350 Views
  • 1 reply
  • 1 kudos

Databricks AWS deployment with custom configurations (workspace root storage)

Hi everyone, I have a question about the IAM role for workspace root storage when deploying Databricks on AWS with custom configurations (customer-managed VPC, storage configurations, credential configurations, etc.). At an earlier stage of our deploym...

  • 350 Views
  • 1 reply
  • 1 kudos
Latest Reply
MoJaMa
Databricks Employee
  • 1 kudos

Great question. [1] In the pre-UC world, when you created a workspace you would designate a bucket/container that was used for what was most commonly known as DBFS, i.e. it's where the hive-metastore managed tables would be stored by default along with ...

  • 1 kudos
hgintexas
by New Contributor II
  • 284 Views
  • 3 replies
  • 2 kudos

system.lakeflow.job_task_run_timeline table missing task parameters on for each loop input

The system.lakeflow.job_task_run_timeline table does not include the task-level parameters on the input of the for-each loop if the parameter is set dynamically in another notebook using dbutils.jobs.taskValues.set. This information is not included in the...

  • 284 Views
  • 3 replies
  • 2 kudos
Latest Reply
stbjelcevic
Databricks Employee
  • 2 kudos

Hi @hgintexas , You’re right, the system job timeline tables and the Runs API don’t currently surface the resolved per‑iteration inputs for a For-each task when those inputs are sourced via task values set in another notebook with dbutils.jobs.taskVa...

  • 2 kudos
2 More Replies
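Until the system tables surface resolved per-iteration inputs, one workaround consistent with the reply above is to record them yourself at the start of each iteration. The helper below is hypothetical (not a Databricks API); the in-memory list stands in for an append to an audit Delta table keyed by job, run, and iteration.

```python
import json
import datetime

audit_log = []  # stand-in for appending to an audit Delta table

def record_iteration_input(job_id: int, run_id: int, iteration: int, params: dict):
    """Hypothetical helper: persist the resolved input of one for-each
    iteration so it can be joined back to job_task_run_timeline later."""
    audit_log.append({
        "job_id": job_id,
        "run_id": run_id,
        "iteration": iteration,
        "params_json": json.dumps(params, sort_keys=True),
        "logged_at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    })

# Example: first iteration of a hypothetical run.
record_iteration_input(42, 9001, 0, {"input": "file_a.csv"})
print(audit_log[0]["params_json"])
```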
Giuseppe_C
by New Contributor
  • 374 Views
  • 2 replies
  • 0 kudos

Databricks import directory false positive import

Hello everyone, I'm using the Databricks CLI to move several directories from Azure Repos to the Databricks Workspace. The problem is that files are not updating properly, with no error displayed. The self-hosted agent in the pipeline I'm using has installed the ...

  • 374 Views
  • 2 replies
  • 0 kudos
Latest Reply
stbjelcevic
Databricks Employee
  • 0 kudos

Hi @Giuseppe_C, can you try running the command with the --debug flag to see if there is any additional information you can gather? Full command: databricks import-dir <source> <dest> --overwrite --debug. Also, verify that the target path is not ...

  • 0 kudos
1 More Replies
jkdatabricks
by New Contributor
  • 1872 Views
  • 3 replies
  • 0 kudos

Error: PERMISSION_DENIED: AWS IAM role does

Hello, we are trying to set up a new workspace. However, we are getting the following error: Workspace failed to launch. Error: PERMISSION_DENIED: AWS IAM role does not have READ permissions on url s3://jk-databricks-prods3/unity-catalog/742920957025975. Pl...

  • 1872 Views
  • 3 replies
  • 0 kudos
Latest Reply
dbxdev
New Contributor II
  • 0 kudos

I am assuming you are creating this manually and not using Terraform. Is s3://jk-databricks-prods3/unity-catalog/742920957025975 the external storage location for your UC Metastore? Can you check if the storage credential used by this external storag...

  • 0 kudos
2 More Replies
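The missing READ permission from the error above can be illustrated as the generic shape of the S3 statements the role would need. The bucket and prefix come from the error message itself; the policy below is a simplified example, not Databricks' exact recommended Unity Catalog policy (which also includes write and other actions).

```python
import json

# Illustrative read-only statements for the bucket named in the error.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        # ListBucket applies to the bucket ARN; GetObject to the objects.
        "Action": ["s3:GetObject", "s3:ListBucket"],
        "Resource": [
            "arn:aws:s3:::jk-databricks-prods3",
            "arn:aws:s3:::jk-databricks-prods3/unity-catalog/*",
        ],
    }],
}
print(json.dumps(policy, indent=2))
```

Also check the role's trust relationship and, for cross-account setups, the external ID; a missing trust policy produces the same PERMISSION_DENIED symptom.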