04-19-2024 02:09 AM
Hi all,
I have recently enabled Unity Catalog in my DBX workspace and created a new catalog with an external location on Azure storage.
I can create new schemas (databases) in the new catalog, but I can't create a table. I get the error below when trying to create one:
"
Failed to acquire a SAS token for list on /__unitystorage/catalogs/d333fd28-cf81-4737-90f0-d164883a55ca/tables/58c77eb9-c058-447f-b3b9-58bd2760e381/_delta_log due to java.util.concurrent.ExecutionException: com.databricks.sql.managedcatalog.UnityCatalogServiceException: [RequestId=b2e77357-64ff-48ec-a85b-3eb1da8d6764 ErrorClass=TABLE_DOES_NOT_EXIST.RESOURCE_DOES_NOT_EXIST] Table '58c70eb9-c059-447f-b3b9-58bd2760e321' does not exist.
"
When I try to access the external location through DBX I get the below error:
"
"
Has anyone seen either these errors before and understands how to fix them?
Thanks,
Sean
04-19-2024 02:20 AM
@Snoonan
Make sure the permissions are correct.
The Databricks Access Connector's managed identity requires at least:
- Storage Blob Data Reader on the storage account,
- Storage Blob Data Contributor on the container
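If it helps, the two role assignments above can be granted with the Azure CLI roughly as follows. This is a sketch: the resource group, storage account, container, and Access Connector names are placeholders you'd replace with your own, and it assumes the `databricks` CLI extension is installed.

```shell
# Placeholder resource names -- substitute your own.
RG="my-resource-group"
STORAGE="mystorageaccount"
CONTAINER="mycontainer"
CONNECTOR="my-access-connector"

# Principal ID of the Access Connector's managed identity.
PRINCIPAL_ID=$(az databricks access-connector show \
  --resource-group "$RG" --name "$CONNECTOR" \
  --query identity.principalId -o tsv)

# Resource ID of the storage account (used to build the scopes).
STORAGE_ID=$(az storage account show -g "$RG" -n "$STORAGE" --query id -o tsv)

# Storage Blob Data Reader at the storage-account scope.
az role assignment create --assignee "$PRINCIPAL_ID" \
  --role "Storage Blob Data Reader" --scope "$STORAGE_ID"

# Storage Blob Data Contributor at the container scope.
az role assignment create --assignee "$PRINCIPAL_ID" \
  --role "Storage Blob Data Contributor" \
  --scope "$STORAGE_ID/blobServices/default/containers/$CONTAINER"
```

Note that role assignments can take a few minutes to propagate before Unity Catalog sees them.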
04-19-2024 03:05 AM
Hi @daniel_sahal ,
Thank you. I had not given the Access Connector 'Storage Blob Data Reader' on the storage account.
Since doing this, I now get the error:
"
This Azure storage request is not authorized. The storage account's 'Firewalls and virtual networks' settings may be blocking access to storage services. Please verify your Azure storage credentials or firewall exception settings.
"
What do you think needs to be done here? Databricks notebooks have read from this storage account before.
04-19-2024 03:27 AM
@Snoonan
First of all, check the Networking tab on the storage account to see if it's behind a firewall. If it is, make sure that the Databricks/storage networking is properly configured (https://learn.microsoft.com/en-us/azure/databricks/security/network/classic/vnet-inject)
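One quick way to inspect those firewall settings is with the Azure CLI (again a sketch; the resource group and account names are placeholders):

```shell
RG="my-resource-group"
STORAGE="mystorageaccount"

# "Deny" means the firewall is on and only explicitly listed
# networks/IPs are allowed through; "Allow" means it is open.
az storage account show -g "$RG" -n "$STORAGE" \
  --query networkRuleSet.defaultAction -o tsv

# List the VNet rules and IP rules currently allowed through the firewall.
az storage account network-rule list -g "$RG" --account-name "$STORAGE"
```

If the default action is Deny, the subnets your Databricks compute runs in must appear in those rules for storage access to succeed.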
04-19-2024 05:15 AM
I have checked this and the VNet is set up correctly, to the best of my knowledge.
I can access the data from a notebook using a Unity Catalog-enabled cluster. For the SQL editor I am using a serverless SQL warehouse.
I wonder how these differ such that one can access the data and the other can't?
04-21-2024 11:27 PM
@Snoonan
Serverless compute needs a slightly different network setup than classic clusters.
More info here: https://learn.microsoft.com/en-us/azure/databricks/security/network/serverless-network-security/serv...
04-22-2024 12:01 AM
That worked. Thank you for your help.