Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Permission Denied when Creating External Tables Using Workspace Default Credential

j_unspeakable
New Contributor III

I’m building out schemas, volumes, and external Delta tables in Unity Catalog via Terraform. The schemas and volumes are created successfully, but all external tables are failing.
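For context, the failing resource looks roughly like this — a minimal sketch using the `databricks_sql_table` resource from the Databricks Terraform provider, with placeholder catalog, schema, and path names:

```hcl
# Hypothetical external Delta table definition; names are illustrative.
resource "databricks_sql_table" "fr012" {
  catalog_name = "catalog"
  schema_name  = "schema"
  name         = "fr012"

  table_type         = "EXTERNAL"
  data_source_format = "DELTA"
  storage_location   = "abfss://nexus-dev@stinfinasdevuks.dfs.core.windows.net/gold/table/fr002"
}
```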

The error message from Terraform doesn't highlight the root cause; it returns only:

Error: cannot create sql table: statement failed to execute: FAILED

I tested manually in the Databricks SQL editor with:

CREATE TABLE catalog.schema.fr012
USING DELTA
LOCATION 'abfss://nexus-dev@stinfinasdevuks.dfs.core.windows.net/gold/table/fr002';

and I got this error:

PERMISSION_DENIED: The credential 'dbw_infinas_dev_uks' is a workspace default credential that is only allowed to access data in the following paths:
'abfss://unity-catalog-storage@dbstoragefkilqgly2oevu.dfs.core.windows.net/1390520157504627'.
Please ensure that any path accessed using this credential is under one of these paths.

However, I was able to create and browse the same External Location in the Unity Catalog UI using the same default credential.

I did some research and found claims that workspace default credentials in Databricks are not exclusively scoped to Databricks-managed storage paths: while they can access managed storage, they can supposedly also access external storage locations, particularly when combined with storage credentials and external locations.

Questions:

  1. Is it expected that a workspace default credential can browse an External Location in the UI but cannot be used to create external tables against it?

  2. Will creating a separate storage credential and external location (backed by an SPN/MANAGED_IDENTITY) resolve the table-creation issue?

  3. I found examples stating default credentials can access external data—are these outdated, or am I missing steps?


I’ve attached screenshots of:

  • The External Location with the default credential

  • The PERMISSION_DENIED error message


1 ACCEPTED SOLUTION

Accepted Solutions

j_unspeakable
New Contributor III

I was able to fix this by creating a new Access Connector for Azure Databricks, assigning it the appropriate permission on the storage account, creating a new storage credential, and using that credential to register my external location.

https://learn.microsoft.com/en-us/azure/databricks/connect/unity-catalog/cloud-storage/storage-crede... 
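For anyone hitting the same error, the steps above can be sketched in Terraform (resource names, resource group, and region are illustrative; the `azurerm` and `databricks` providers and an existing `azurerm_storage_account.example` are assumed):

```hcl
# 1. Access Connector for Azure Databricks (system-assigned managed identity).
resource "azurerm_databricks_access_connector" "uc" {
  name                = "dbw-uc-access-connector"
  resource_group_name = "rg-example"
  location            = "uksouth"
  identity {
    type = "SystemAssigned"
  }
}

# 2. Grant the connector's identity access to the storage account.
resource "azurerm_role_assignment" "uc_storage" {
  scope                = azurerm_storage_account.example.id
  role_definition_name = "Storage Blob Data Contributor"
  principal_id         = azurerm_databricks_access_connector.uc.identity[0].principal_id
}

# 3. Unity Catalog storage credential backed by the connector.
resource "databricks_storage_credential" "uc" {
  name = "uc-managed-identity"
  azure_managed_identity {
    access_connector_id = azurerm_databricks_access_connector.uc.id
  }
}

# 4. External location registered with the new credential.
resource "databricks_external_location" "gold" {
  name            = "nexus-dev-gold"
  url             = "abfss://nexus-dev@stinfinasdevuks.dfs.core.windows.net/gold"
  credential_name = databricks_storage_credential.uc.name
}
```

With the external location in place, the original CREATE TABLE ... LOCATION statement should resolve through the new credential instead of the workspace default one.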
