02-04-2025 09:33 PM
Labels: Delta Lake
Accepted Solutions
02-05-2025 06:41 AM
Hi @bohemiaRDX, greetings!
Generally, this error occurs when the path has not been added as an external location backed by a storage credential. In that case the cluster is trying to access storage that has neither a Unity Catalog (UC) storage credential nor any non-UC access mechanism (such as access keys or a service principal) configured, so the request fails with this error.
What’s Next:
- Verify whether the given storage location needs to be accessed via Unity Catalog or not.
- If the access is through Unity Catalog:
- Ensure that a storage credential and an external location covering the path are set up. Refer to https://learn.microsoft.com/en-us/azure/databricks/storage/azure-storage#--connect-to-azure-data-lak... and https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/manage-external-loc... for more information.
- For table access failures, verify that the schema being used is a UC one. This error can also occur when the query resolves to the hive_metastore catalog (for example, when no catalog and schema are set explicitly and the default catalog points to hive_metastore). The query below shows the current catalog and schema; a sketch of these sub-steps follows the list.
- `SELECT current_catalog(), current_schema()`
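
A minimal notebook sketch of the UC sub-steps above, assuming an existing UC storage credential and the required privileges. The credential name, external location name, catalog, schema, and abfss path are placeholders; replace them with your own.

```python
# Register the path as a UC external location backed by an existing storage
# credential (placeholder names -- requires CREATE EXTERNAL LOCATION privilege).
spark.sql("""
  CREATE EXTERNAL LOCATION IF NOT EXISTS my_ext_loc
  URL 'abfss://my-container@mystorageaccount.dfs.core.windows.net/path'
  WITH (STORAGE CREDENTIAL my_storage_credential)
""")

# Check what the session currently resolves to; if this returns
# hive_metastore, unqualified table names will not go through UC.
display(spark.sql("SELECT current_catalog(), current_schema()"))

# Point the session at a UC catalog/schema if needed (placeholder names).
spark.sql("USE CATALOG my_uc_catalog")
spark.sql("USE SCHEMA my_schema")
```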
- If the access is not UC-based, ensure that the relevant access configuration is in place. It can be set using OAuth 2.0 with an Azure service principal, a SAS token, or account keys. These are usually set at the notebook level, or in the cluster Spark configuration via the UI or an init script; for a SQL warehouse, they are set in the warehouse's data access configuration settings. A notebook-level sketch follows the reference link below.
Refer to https://learn.microsoft.com/en-us/azure/databricks/storage/azure-storage#connect-to-azure-data-lake-... for more details.
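A minimal sketch of notebook-level, non-UC access using OAuth 2.0 with an Azure service principal, following the pattern in the linked documentation. The storage account name and the secret scope/key names are placeholders.

```python
# Placeholder storage account; the service principal credentials are read
# from a Databricks secret scope rather than hard-coded.
storage_account = "mystorageaccount"
client_id = dbutils.secrets.get(scope="my-scope", key="sp-client-id")
client_secret = dbutils.secrets.get(scope="my-scope", key="sp-client-secret")
tenant_id = dbutils.secrets.get(scope="my-scope", key="sp-tenant-id")

# Configure ABFS to authenticate to this storage account via OAuth 2.0
# client credentials (service principal).
spark.conf.set(f"fs.azure.account.auth.type.{storage_account}.dfs.core.windows.net", "OAuth")
spark.conf.set(
    f"fs.azure.account.oauth.provider.type.{storage_account}.dfs.core.windows.net",
    "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
)
spark.conf.set(f"fs.azure.account.oauth2.client.id.{storage_account}.dfs.core.windows.net", client_id)
spark.conf.set(f"fs.azure.account.oauth2.client.secret.{storage_account}.dfs.core.windows.net", client_secret)
spark.conf.set(
    f"fs.azure.account.oauth2.client.endpoint.{storage_account}.dfs.core.windows.net",
    f"https://login.microsoftonline.com/{tenant_id}/oauth2/token",
)

# After this, the path should be readable with standard Spark APIs, e.g.:
# df = spark.read.format("delta").load("abfss://my-container@mystorageaccount.dfs.core.windows.net/path")
```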
Leave a like if this helps; follow-ups are appreciated.
Kudos,
Ayushi