Unable to create a managed table in the Unity Catalog default location

Kris2
New Contributor II

We have set up the metastore with a Managed Identity, and when trying to create a managed table in the default location I am hitting the error below. The storage is ADLS Gen2.

 


AbfsRestOperationException: Operation failed: "This request is not authorized to perform this operation.", 403, GET, https://xxxxxxx(masked intentionally).dfs.core.windows.net/unity-catalog?upn=false&resource=filesystem&maxResults=5000&directory=xxxx((masked intentionally))/tables/3b26d0dd-43b9-45cf-afb1-3d4d83246851/_delta_log&continuation=NTU4NjA5OTE0NzQwMTQ1MDUyNyAwIDAwMDAwMDAwMDAwMDAwMDAwMDA=&timeout=90&recursive=false&st=2023-12-11T16:31:06Z&sv=2020-02-10&ske=2023-12-11T18:31:06Z&sig=XXXXX&sktid=e9aef9b7-25ca-4518-a881-33e546773136&se=2023-12-11T17:51:09Z&sdd=3&skoid=b9083a16-f310-4656XXXXXXXXXXXXXXXXXX&spr=https&sks=b&skt=2023-12-11T16:31:06Z&sp=rl&skv=2020-02-10&sr=d, AuthorizationFailure, "This request is not authorized to perform this operation. RequestId:
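
For reference, a minimal repro of what I am running (the catalog, schema, and table names below are placeholders) is:

```python
# Minimal repro, run from a notebook on a Unity Catalog-enabled cluster.
# "main.default" and "demo_managed_tbl" are placeholder names; with no LOCATION
# clause the table is managed, so its files go to the metastore's default storage,
# and the 403 above is raised while Delta accesses the table's _delta_log there.
spark.sql("CREATE TABLE main.default.demo_managed_tbl (id INT, name STRING)")
```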

1 ACCEPTED SOLUTION


Ayushi_Suthar
Databricks Employee

Hi @Kris2, I completely understand your hesitation and appreciate your approach to seeking guidance!

This error generally means that the cluster has network connectivity to the Unity Catalog configured storage location but is not authorized to access the storage. It can happen for the following reasons:

  1. the access connector does not have the right role,
  2. the storage firewall is blocking the request, or
  3. the wrong storage credentials / Spark Azure keys are being used.

You can check the below details to resolve this error: 
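
As one quick, cluster-side sanity check for point 3, you can look for ABFS credentials configured on the cluster itself, since per-account keys or OAuth settings there can shadow the access connector. A minimal sketch (it only inspects Spark confs visible to the session) is:

```python
# Look for ABFS credential settings configured on the cluster; stale account
# keys or service-principal OAuth confs can shadow the Unity Catalog access
# connector and cause 403s on managed paths.
# Note: spark.sparkContext may not be accessible on shared-access clusters.
abfs_confs = {
    k: v for k, v in spark.sparkContext.getConf().getAll()
    if "fs.azure.account" in k
}
print(abfs_confs or "No per-account ABFS keys/OAuth confs set on this cluster")
```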

Leave a like if this helps; follow-ups are appreciated.

Kudos,

Ayushi


2 REPLIES


bjosh1
New Contributor II

Hi,

I am facing the same issue. I am getting a 403 error while creating a TABLE.

The folder gets created inside the metastore in ADLS every time I run the CREATE TABLE command.

The permissions are the `Storage Blob Data Contributor` role assigned at the storage account level.

At the container level the permissions are inherited.

At the container level I have also added the `Storage Blob Data Owner` role.

I have also tried adding ACLs at further folder levels for the Access Connector for Azure Databricks. I have added Databricks resource instances for permissions, with the Access Connector name, in the storage account's Networking tab.

I have ticked "Allow trusted Microsoft services to access" in the storage account's Networking tab.

I can create the catalog and schema, but table creation errors out because it writes data to ADLS.

I have gone through the above documents.
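
Since catalog and schema creation only touch the metastore, while CREATE TABLE writes a `_delta_log` folder to the managed storage, it looks like a storage-side issue. For double-checking the networking settings programmatically, a minimal sketch with the Azure SDK (assuming `azure-identity` and `azure-mgmt-storage` are installed; the subscription, resource group, and account names are placeholders) is:

```python
# Dump the storage account's firewall configuration to compare against what
# the access connector needs. All identifiers below are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient

client = StorageManagementClient(DefaultAzureCredential(), "<subscription-id>")
account = client.storage_accounts.get_properties("<resource-group>", "<storage-account>")

rules = account.network_rule_set
print("Default action:", rules.default_action)  # "Deny" means the firewall is active
print("Bypass:", rules.bypass)                  # should include "AzureServices" for trusted services
for r in rules.resource_access_rules or []:
    print("Resource instance:", r.resource_id)  # the access connector should be listed here when the firewall is on
```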