I'm building out schemas, volumes, and external Delta tables in Unity Catalog via Terraform. The schemas and volumes are created successfully, but all external tables are failing.
The error message from Terraform doesn't indicate what the underlying issue is; it returns only (Error: cannot create sql table: statement failed to execute: FAILED).
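For context, the table definitions in Terraform look roughly like this (a minimal sketch using the databricks_sql_table resource; the catalog, schema, and table names here are placeholders rather than my exact configuration):

# Sketch of one of the failing external table resources (names are illustrative)
resource "databricks_sql_table" "fr012" {
  catalog_name       = "catalog"
  schema_name        = "schema"
  name               = "fr012"
  table_type         = "EXTERNAL"
  data_source_format = "DELTA"
  storage_location   = "abfss://nexus-dev@stinfinasdevuks.dfs.core.windows.net/gold/table/fr002"
  comment            = "External Delta table pointing at existing data"
}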
I tested manually in the Databricks SQL editor with:
CREATE TABLE catalog.schema.fr012
USING DELTA
LOCATION 'abfss://nexus-dev@stinfinasdevuks.dfs.core.windows.net/gold/table/fr002';
and I got this error:
PERMISSION_DENIED: The credential 'dbw_infinas_dev_uks' is a workspace default credential that is only allowed to access data in the following paths:
'abfss://unity-catalog-storage@dbstoragefkilqgly2oevu.dfs.core.windows.net/1390520157504627'.
Please ensure that any path accessed using this credential is under one of these paths.
However, I was able to create and browse the same External Location in the Unity Catalog UI using the same default credential.
I did some research and found that workspace default credentials in Databricks are not exclusively scoped to Databricks' managed storage paths. While they are primarily used to access managed storage, they can also be used to access external storage locations, particularly when combined with storage credentials and external locations.
Questions:
Is it expected that the workspace default credential can access an External Location but not be allowed to use it in external table creation?
Will creating a separate storage credential and external location (backed by a service principal or managed identity) resolve the table-creation issue? (A sketch of this approach follows these questions.)
I found examples stating that default credentials can access external data; are these outdated, or am I missing a step?
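For question 2, this is roughly what I have in mind (a sketch only; the access connector reference, principal name, and location names are placeholders, and I haven't yet verified that this resolves the table creation):

# Dedicated storage credential backed by an Azure Databricks access connector
# (managed identity); the connector resource is assumed to exist elsewhere.
resource "databricks_storage_credential" "gold_mi" {
  name = "gold-managed-identity"
  azure_managed_identity {
    access_connector_id = azurerm_databricks_access_connector.gold.id
  }
}

# External location covering the gold container used by the tables
resource "databricks_external_location" "gold" {
  name            = "gold-external"
  url             = "abfss://nexus-dev@stinfinasdevuks.dfs.core.windows.net/gold"
  credential_name = databricks_storage_credential.gold_mi.name
}

# Grant the principal running the table DDL the privileges it needs on the location
resource "databricks_grants" "gold" {
  external_location = databricks_external_location.gold.id
  grant {
    principal  = "terraform-spn" # placeholder principal
    privileges = ["CREATE_EXTERNAL_TABLE", "READ_FILES", "WRITE_FILES"]
  }
}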
I've attached screenshots of: