06-04-2025 04:37 AM
Although not formal support, here are some things to consider as you troubleshoot the problem:
This issue sounds like it stems from a Unity Catalog external location configuration mismatch or permission issue, despite the external location itself being reachable. Here’s a structured checklist to help you troubleshoot the [TABLE_DOES_NOT_EXIST.RESOURCE_DOES_NOT_EXIST] error when creating a table:
✅ Workspace is reachable and accessible
✅ External location is created and points to a valid container
✅ Networking on the Storage Account is open to all endpoints
✅ Volume creation on the same location works (implying access is technically fine)
✅ Table creation without an external location works
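As a quick sanity check from the new workspace, you can also list the path directly and confirm it resolves through your external location (a minimal sketch; the container, account, and path are placeholders for yours):
-- Should return the folders/files under the path if the external location and its credential resolve correctly
LIST 'abfss://<container>@<account>.dfs.core.windows.net/<some_subpath>';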
Make sure the external location is correctly mapped to a valid storage credential, and that:
- The storage credential is of the correct type (e.g., managed identity or service principal).
- It has permission (Storage Blob Data Contributor) on the container and the parent storage account.
Run this SQL in a Databricks notebook in the new workspace to inspect it:
DESCRIBE EXTERNAL LOCATION `<your_external_location_name>`;
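To see which credential actually backs that location and what type it is, you can also inspect the credential itself (a sketch; substitute the credential name that DESCRIBE EXTERNAL LOCATION reports):
-- Shows the credential type (managed identity vs. service principal) and the identity it points at
DESCRIBE STORAGE CREDENTIAL `<credential_name>`;
-- Lists every credential registered in the metastore, handy for spotting the workspace default one
SHOW STORAGE CREDENTIALS;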
Ensure that your Unity Catalog catalog is correctly referencing the external location:
DESCRIBE CATALOG `<your_catalog_name>`;
Also check that the user (or a group they belong to) has both of these:
- USE CATALOG on the catalog
- CREATE TABLE privilege at the catalog or schema level
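If either of those is missing, the grants look roughly like this (a sketch only; the principal and object names are placeholders):
-- Typical minimum to create a managed table in a schema
GRANT USE CATALOG ON CATALOG my_catalog TO `user@example.com`;
GRANT USE SCHEMA ON SCHEMA my_catalog.my_schema TO `user@example.com`;
GRANT CREATE TABLE ON SCHEMA my_catalog.my_schema TO `user@example.com`;
-- Review what is already granted
SHOW GRANTS ON SCHEMA my_catalog.my_schema;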
Confirm the workspace is attached to the correct Unity Catalog metastore:
- Go to Admin Console → Unity Catalog → Metastore assignments
- Make sure the new workspace is correctly assigned and not in an inconsistent state
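You can confirm the attachment from a notebook in the new workspace without leaving SQL (sketch):
-- Returns the metastore ID this workspace is attached to; compare it with the one shown in the account console
SELECT CURRENT_METASTORE();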
If you are doing a CREATE TABLE for an external (non-managed) table, be sure you are not mixing paths or missing the LOCATION specification. This can trip up catalogs backed by Unity Catalog external locations.
Try creating the table with an explicit path override just to test:
CREATE TABLE my_catalog.my_schema.my_table (id INT)
LOCATION 'abfss://<container>@<account>.dfs.core.windows.net/<some_subpath>';
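Note that the LOCATION variant also needs a privilege on the external location itself, in addition to CREATE TABLE on the schema (sketch; names are placeholders):
-- Required for CREATE TABLE ... LOCATION against that external location
GRANT CREATE EXTERNAL TABLE ON EXTERNAL LOCATION `<your_external_location_name>` TO `user@example.com`;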
If the schema (aka “database”) directory does not yet exist in the storage account (under /catalog_name/schema_name/), Databricks may fail silently or with confusing errors. Manually create the schema directory, or try:
CREATE SCHEMA my_catalog.my_schema;
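If you want the schema's data to live in the new storage account rather than in the catalog/metastore default location, you can also pin that explicitly (sketch; the path is a placeholder and must sit under an external location you have access to):
-- Creates the schema with its managed storage rooted at the given path
CREATE SCHEMA IF NOT EXISTS my_catalog.my_schema
MANAGED LOCATION 'abfss://<container>@<account>.dfs.core.windows.net/<some_subpath>';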
Unity Catalog uses a specific metadata path structure. If the UC metastore cannot reconcile the metadata with the storage path, it may show RESOURCE_DOES_NOT_EXIST. Try checking the table metadata in UC using:
SHOW TABLES IN my_catalog.my_schema;
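If the table does show up but still misbehaves, the extended metadata will tell you which storage path UC has recorded for it (sketch):
-- Shows the resolved Location, Provider, and Owner as Unity Catalog sees them
DESCRIBE TABLE EXTENDED my_catalog.my_schema.my_table;
-- Same idea at the schema level
DESCRIBE SCHEMA EXTENDED my_catalog.my_schema;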
Try recreating the external location using the Databricks UI and make sure it’s validated.
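If you prefer SQL over the UI for that step, the shape is roughly this (sketch; the location name, URL, and credential name are all placeholders, and the storage credential must already exist):
CREATE EXTERNAL LOCATION IF NOT EXISTS `<your_external_location_name>`
URL 'abfss://<container>@<account>.dfs.core.windows.net/<some_subpath>'
WITH (STORAGE CREDENTIAL `<credential_name>`)
COMMENT 'External location for the new storage account';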
Double check the workspace identity (Managed Identity or Service Principal) is the one configured in the storage credential and has access to the container.
If all else fails: Try creating the same setup in a test workspace with minimal security configs to rule out weird propagation or policy issues.
Based on your description, here are the likely culprits:
Unity Catalog is unable to register or recognize the metadata for the new table due to a missing schema directory or improper mapping.
The external location credential identity has storage access but not metastore access.
The workspace might not have fully propagated Unity Catalog setup after being added to the metastore.
Hope this sets you in the right direction.
Cheers, Lou.
3 weeks ago
Hi Lou,
Thank you so much for your detailed reply, and I apologize for leaving this open for so long. I got wrapped up in another project and am just getting back to this.
I was able to resolve it, at least in my situation, last night, so I wanted to add what I found to this thread in case it helps someone else.
It used to be the case that you had to create the Access Connector for Databricks manually and then give it Storage Blob Data Contributor on the Storage Account (Azure). Now, when you create a Databricks Workspace, it automatically creates the Access Connector for you (I have a related gripe with the way it names them: if you have multiple workspaces, there's no easy way to tell which connector belongs to which workspace without hovering over it to see the full description). Since this auto-generated Access Connector was attached to the Databricks workspace, we assumed it could be used, and we made sure it had Storage Blob Data Contributor on the new Storage Account.
When I tried to create the table with a defined EXTERNAL LOCATION, I got an error message that said "The credential <credential_name> is a workspace default credential that is only allowed to access data in the following paths <default_path>."
This made me realize that the Access Connectors Databricks creates automatically are treated as workspace default credentials and are not allowed to be used against other Storage Accounts. This is pretty odd to me, but it seems to be the case.
When I created a new Access Connector, tied it to this same Databricks Workspace, and then gave it Storage Blob Data Contributor on the new Storage Account, everything worked as expected.
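For anyone else who hits this: once the new Access Connector is registered as a storage credential, it's easy to confirm you are no longer on the workspace default before trying the table again (a rough sketch; the location name and path are placeholders for whatever you called yours):
-- The Credential shown here should be the manually created connector, not the auto-generated default
DESCRIBE EXTERNAL LOCATION `new_storage_account_location`;
-- Should list the container contents instead of raising the 'workspace default credential' error
LIST 'abfss://<container>@<account>.dfs.core.windows.net/';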
Thanks!
Seth Parker