
Unity Catalog issues

Snoonan
New Contributor II

Hi all,

I have recently enabled Unity Catalog in my DBX workspace. I have created a new catalog with an external location on Azure storage (ADLS Gen2).
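For reference, the setup looks roughly like this when run from a notebook (the catalog, schema, table, and storage names below are placeholders, not my real ones):

# Sketch of the statements involved; `spark` is the SparkSession that Databricks notebooks predefine.
# The storage credential and external location were created beforehand via Catalog Explorer.
spark.sql("""
    CREATE CATALOG IF NOT EXISTS my_catalog
    MANAGED LOCATION 'abfss://my-container@mystorageaccount.dfs.core.windows.net/'
""")
spark.sql("CREATE SCHEMA IF NOT EXISTS my_catalog.my_schema")  # this succeeds
spark.sql("CREATE TABLE my_catalog.my_schema.my_table (id INT, name STRING)")  # this fails with the error below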

I can create new schemas (databases) in the new catalog, but I can't create a table. I get the error below when trying to create one:

 "

Failed to acquire a SAS token for list on /__unitystorage/catalogs/d333fd28-cf81-4737-90f0-d164883a55ca/tables/58c77eb9-c058-447f-b3b9-58bd2760e381/_delta_log due to java.util.concurrent.ExecutionException: com.databricks.sql.managedcatalog.UnityCatalogServiceException: [RequestId=b2e77357-64ff-48ec-a85b-3eb1da8d6764 ErrorClass=TABLE_DOES_NOT_EXIST.RESOURCE_DOES_NOT_EXIST] Table '58c70eb9-c059-447f-b3b9-58bd2760e321' does not exist.

"

When I try to access the external location through DBX I get the below error:

"

Error loading files.
Input path url 'abfss://dbx-xxx-hive-metastore-unity-catalog@yyyyydfs.core.windows.net/__unitystorage' overlaps with managed storage within 'ListFiles' call

"

Has anyone seen either of these errors before and know how to fix them?

Thanks,

Sean

 


daniel_sahal
Esteemed Contributor

@Snoonan 
Make sure that the permissions are correct.
The Databricks Access Connector requires at least:
- Storage Blob Data Reader on the storage account,
- Storage Blob Data Contributor on the container.
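Once those roles are in place, a quick way to confirm the connector can actually reach the container is to list the external location path from a notebook on a Unity Catalog-enabled cluster, for example (the path below is a placeholder for your own container and account):

# List the root of the external location; with correct role assignments and networking
# this returns the folder contents rather than a permission/SAS token error.
files = dbutils.fs.ls("abfss://my-container@mystorageaccount.dfs.core.windows.net/")
display(files)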

Hi @daniel_sahal ,

Thank you. I had not given the access connector 'Blob Data Reader' on the storage account.

Since granting it, I now get the error below:

"

This Azure storage request is not authorized. The storage account's 'Firewalls and virtual networks' settings may be blocking access to storage services. Please verify your Azure storage credentials or firewall exception settings.

"

What do you think needs to be done here? DBX notebooks have read from this storage account before.

 

daniel_sahal
Esteemed Contributor

@Snoonan 
First of all, check the Networking tab on the storage account to see if it's behind a firewall. If it is, make sure that the Databricks/storage networking is properly configured (https://learn.microsoft.com/en-us/azure/databricks/security/network/classic/vnet-inject).

@daniel_sahal ,

I have checked this and, to the best of my knowledge, the VNet is set up correctly.

I can access the data from a notebook using a Unity Catalog-enabled cluster. For the SQL editor I am using a serverless SQL warehouse.

I wonder how these differ such that one can access the data and the other can't?

daniel_sahal
Esteemed Contributor

@Snoonan 
Serverless needs a slightly different setup than normal clusters.
More info here: https://learn.microsoft.com/en-us/azure/databricks/security/network/serverless-network-security/serv...

@daniel_sahal 

That worked. Thank you for your help.
