Error Creating Table

QLA_SethParker
New Contributor III
We are a current Databricks customer (Azure Databricks) experiencing an issue when creating a table.
 
We have an existing Metastore in the Central region.  All other Workspaces in this Metastore/Region are behind Private Endpoints.  We are trying to create a new Workspace that is not behind private endpoints, but is only available to specific VNets.  We created the new Workspace.  We created a new External Location pointed at a new Storage Account Container (the networking on the new Storage Account is open to all endpoints currently).
 
We created a new Catalog using this new External Location.  When we try to create a table in this new catalog, we get the following error message:
[RequestId=ea69c411-f03c-905c-8365-05903941a65c ErrorClass=TABLE_DOES_NOT_EXIST.RESOURCE_DOES_NOT_EXIST] Table '4e8d7f5a-2f5a-4dd1-b9b7-5c646628da86' does not exist.
 
When we look in the storage account, we see that the folder for this new table was created.

[Screenshot: the new table's folder visible in the storage account container]
I believe the permissions on the Storage Account and the networking to it are correct.  I am also able to create a Volume against this Storage Container and add/remove files from the Volume.
 
It seems like there is an issue in either the Metastore itself or Unity Catalog.
 
As a note, if we create another Catalog without specifying an External Location, we are able to create a table without issue.
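
Roughly, the two setups look like this (these are placeholder names, not our actual objects; the managed location path sits under the new External Location):

-- Catalog pinned to the new External Location's storage (this is where table creation fails)
CREATE CATALOG sales_cat
MANAGED LOCATION 'abfss://data@newstorageacct.dfs.core.windows.net/sales_cat';
CREATE SCHEMA sales_cat.bronze;
CREATE TABLE sales_cat.bronze.orders (id INT);

-- Catalog that falls back to the metastore default storage (this works fine)
CREATE CATALOG scratch_cat;
CREATE SCHEMA scratch_cat.bronze;
CREATE TABLE scratch_cat.bronze.orders (id INT);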
 
Does anyone have ideas of other things I can check that might cause this error?
2 REPLIES

BigRoux
Databricks Employee

Although this isn't formal support, here are some things to consider as you troubleshoot the problem:

This issue sounds like it stems from a Unity Catalog external location configuration mismatch or permission issue, despite the external location itself being reachable. Here’s a structured checklist to help you troubleshoot the [TABLE_DOES_NOT_EXIST.RESOURCE_DOES_NOT_EXIST] error when creating a table:

 

 

Things You’ve Already Verified

  • Workspace is reachable and accessible

  • External location is created and points to a valid container

  • Networking on the Storage Account is open to all endpoints

  • Volume creation on the same location works (implying access is technically fine)

  • Table creation without an external location works

🔍 Troubleshooting Checklist

1. Check External Location and Storage Credential Mapping

  • Make sure the external location is correctly mapped to a valid storage credential, and that:

    • The storage credential is of the correct type (e.g., managed identity or service principal).

    • It has permission (Storage Blob Data Contributor) on the container and the parent storage account.

  • Run this SQL in a Databricks notebook in the new workspace to inspect:

DESCRIBE EXTERNAL LOCATION `<your_external_location_name>`;
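
  • As a quick cross-check, you can also list every external location visible from the new workspace, along with the credential each one uses (assuming you have the rights to see them):

SHOW EXTERNAL LOCATIONS;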

 

2. Catalog Permissions

  • Ensure that your Unity Catalog catalog is correctly referencing the external location:

DESCRIBE CATALOG `<your_catalog_name>`;

  • Also check if the user (or the group they belong to) has both of these:

    • USE CATALOG <catalog>

    • CREATE TABLE privilege on the catalog or schema level
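
  • If either privilege is missing, grants along these lines should cover it (the principal and object names here are just examples; in practice USE SCHEMA on the target schema is also required):

GRANT USE CATALOG ON CATALOG my_catalog TO `data_engineers`;
GRANT USE SCHEMA ON SCHEMA my_catalog.my_schema TO `data_engineers`;
GRANT CREATE TABLE ON SCHEMA my_catalog.my_schema TO `data_engineers`;

SHOW GRANTS ON SCHEMA my_catalog.my_schema;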

3. Check the Metastore Assignment

  • Confirm the workspace is attached to the correct Unity Catalog metastore:

    • Go to Admin Console → Unity Catalog → Metastore assignments

    • Make sure the new workspace is correctly assigned and not in an inconsistent state
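
  • From a notebook in the new workspace you can also check which metastore it resolves to and compare the returned ID with your Central-region metastore:

SELECT current_metastore();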

4. Table Creation Command Structure

  • If you are doing a CREATE TABLE (non-managed), be sure you are not mixing paths or missing specifications. This can trip up UC-managed external catalogs.

  • Try creating the table with an explicit path override just to test:

 

CREATE TABLE my_catalog.my_schema.my_table (id INT)
LOCATION 'abfss://<container>@<account>.dfs.core.windows.net/<some_subpath>';
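
  • If that statement succeeds, you can confirm the location Unity Catalog recorded for the table with:

DESCRIBE TABLE EXTENDED my_catalog.my_schema.my_table;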

 

5. Schema Directory Issue

  • If the schema (aka “database”) directory does not yet exist in the storage account (under /catalog_name/schema_name/), Databricks may fail silently or with confusing errors.

  • Manually create the schema directory, or try:

 

CREATE SCHEMA my_catalog.my_schema;
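
  • To double-check where the schema's storage root actually resolved, you can then run:

DESCRIBE SCHEMA EXTENDED my_catalog.my_schema;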

 

6. Unity Catalog Table Path Validation

  • Unity Catalog uses a specific metadata path structure. If the UC metastore cannot reconcile the metadata with the storage path, it may show RESOURCE_DOES_NOT_EXIST.

  • Try checking the table metadata in UC using:

 

SHOW TABLES IN my_catalog.my_schema;

 

🛠️ Additional Recommendations

  • Try recreating the external location using the Databricks UI and make sure it’s validated.

  • Double check the workspace identity (Managed Identity or Service Principal) is the one configured in the storage credential and has access to the container.

  • If all else fails: Try creating the same setup in a test workspace with minimal security configs to rule out weird propagation or policy issues.

 

🧩 Root Cause Theories

Based on your description, here are the likely culprits:

  • Unity Catalog is unable to register or recognize the metadata for the new table due to a missing schema directory or improper mapping.

  • The external location credential identity has storage access but not metastore access.

  • The workspace might not have fully propagated Unity Catalog setup after being added to the metastore.

 

Hope this sets you in the right direction.

Cheers, Lou.

QLA_SethParker
New Contributor III (Accepted Solution)

Hi Lou,

Thank you so much for your detailed reply, and I apologize for leaving this open for so long.  I got wrapped up in another project and am just getting back to this.

I was able to resolve it, at least in my situation, last night, so I wanted to add what I found to this thread in case it helps someone else.

It used to be the case that you had to create the Access Connector for Azure Databricks manually, then grant it Storage Blob Data Contributor on the Storage Account.  Now, when you create a Databricks Workspace, it automatically creates the Access Connector (I have a related gripe with the way it names them: if you have multiple workspaces, there's no easy way to tell which connector belongs to which workspace without hovering over it to see the full description).  We assumed this auto-generated connector could be used, since it was attached to the Databricks workspace, and we made sure it had Storage Blob Data Contributor on the new Storage Account.

When I tried to create the table with a defined EXTERNAL LOCATION, I got an error message that said "The credential <credential_name> is a workspace default credential that is only allowed to access data in the following paths <default_path>."

This made me realize that the Access Connectors Databricks creates automatically are restricted to the workspace's default storage paths and are not allowed to be used against other Storage Accounts.  This is pretty odd to me, but it seems to be the case.

When I created a new Access Connector, tied it to this same Databricks Workspace, and gave it Storage Blob Data Contributor on the new Storage Account, everything worked as expected.
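
For anyone following along later, once the new Access Connector was registered as a storage credential, the working setup looked roughly like this (placeholder names, not our real objects; the credential itself was registered outside SQL, e.g. through Catalog Explorer):

-- Assumes 'new_connector_cred' is the storage credential backed by the manually created Access Connector
CREATE EXTERNAL LOCATION new_sales_loc
URL 'abfss://data@newstorageacct.dfs.core.windows.net/'
WITH (STORAGE CREDENTIAL new_connector_cred);

CREATE CATALOG sales_cat
MANAGED LOCATION 'abfss://data@newstorageacct.dfs.core.windows.net/sales_cat';

CREATE SCHEMA sales_cat.bronze;
CREATE TABLE sales_cat.bronze.orders (id INT);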

Thanks!
Seth Parker
