2 weeks ago
I can't access tables and volumes in Unity Catalog from a cluster in Azure Databricks, although it works with serverless compute. Why is this the case?
※ The cluster summary shows the "UC" (Unity Catalog) badge, and the access mode (data_security_mode) is set to "SINGLE_USER".
I executed the following code in a notebook attached to the cluster, and the error "AuthorizationFailure" was returned.
```sql
%sql
select count(*) from catalogName.schemaName.tableName
```
```python
dbutils.fs.ls('/Volumes/catalogName/schemaName/volumeName')
```
Error detail:
```
AbfsRestOperationException: Operation failed: "This request is not authorized to perform this operation.", 403, GET, https://adls_path, AuthorizationFailure.
```
Accepted Solutions
2 weeks ago
Hi Alex,
How are you doing today? As per my understanding, it looks like your Azure Databricks cluster doesn't have the right permissions to access Unity Catalog tables and volumes, even though serverless works. This could be because serverless compute handles identity passthrough automatically, while your SINGLE_USER cluster might not have the necessary permissions. First, check whether the user assigned to the cluster has SELECT access on the table and READ VOLUME access on the volume. Also verify that the cluster is set up for identity passthrough or has the right Azure storage permissions (such as RBAC on ADLS). You can check your table and volume permissions using SHOW GRANTS, and an admin can grant the required access if needed. If the issue persists, try a serverless or Unity Catalog-enabled shared cluster to see whether the problem is related to your cluster settings.
Regards,
Brahma
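To illustrate the permission checks above, here is a minimal SQL sketch you could run in a notebook. The catalog, schema, table, volume, and user names are placeholders from the question, not real objects; adjust them to your environment.

```sql
-- Check who has which privileges on the table and the volume
SHOW GRANTS ON TABLE catalogName.schemaName.tableName;
SHOW GRANTS ON VOLUME catalogName.schemaName.volumeName;

-- If the assigned user is missing access, an admin can grant it
-- (USE CATALOG / USE SCHEMA are also required to resolve the object path)
GRANT USE CATALOG ON CATALOG catalogName TO `user@example.com`;
GRANT USE SCHEMA ON SCHEMA catalogName.schemaName TO `user@example.com`;
GRANT SELECT ON TABLE catalogName.schemaName.tableName TO `user@example.com`;
GRANT READ VOLUME ON VOLUME catalogName.schemaName.volumeName TO `user@example.com`;
```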
Friday
Thank you for your response.
According to Microsoft Support, the connection is blocked because the ADLS firewall has no rule allowing traffic from the Databricks managed VNet.
Serverless works because it connects via a Private Endpoint.
This might be solved by VNet peering, or by setting up a new workspace with a VNet injection architecture.
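If you go the VNet injection route, the ADLS firewall rules can be inspected and extended with the Azure CLI. This is a sketch only; the storage account, resource group, VNet, and subnet names below are made-up placeholders, and the subnet must already have the Microsoft.Storage service endpoint enabled.

```shell
# List the storage account's current network (firewall) rules
az storage account network-rule list \
  --account-name mystorageacct \
  --resource-group my-rg

# Allow traffic from the VNet-injected workspace's cluster subnet
az storage account network-rule add \
  --account-name mystorageacct \
  --resource-group my-rg \
  --vnet-name my-databricks-vnet \
  --subnet my-private-subnet
```

This only helps for a VNet-injected workspace, where cluster traffic originates from a subnet you control; the default managed VNet's addresses cannot be allow-listed this way.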
Friday
Thanks for the updates, Alex. Good day.

