Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

[INSUFFICIENT_PERMISSIONS] Insufficient privileges: User does not have permission MODIFY on any file

rachelh
Visitor

Just wondering if anyone could help me understand why we are hitting this error: `[INSUFFICIENT_PERMISSIONS] Insufficient privileges: User does not have permission MODIFY on any file`

A job is trying to create a table in an external location (already set up) using serverless compute. The job did this fine on a standard cluster, but once we switched to serverless we hit this error. The job is also being "run as" a service principal that has "CREATE EXTERNAL TABLE", "READ FILES", and "WRITE FILES" on the external location. It appears as though serverless compute is missing these permissions and falling back to legacy table access control, hence looking for the "any file" permission; can someone help me confirm this? I would like to know why the Unity Catalog permissions are not being honored in this case.
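For context, a minimal sketch of what the job runs looks roughly like this (the catalog, schema, table, and storage path below are illustrative placeholders, not our real names):

```sql
-- Illustrative only; the real job uses our actual catalog/schema and ADLS path.
CREATE TABLE my_catalog.my_schema.my_table (
  id   BIGINT,
  name STRING
)
LOCATION 'abfss://container@storageaccount.dfs.core.windows.net/path/to/table';
```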

3 REPLIES

Khaja_Zaffer
Contributor III

Hello @rachelh 

Good day!!

Serverless compute enforces stricter rules: it cannot bypass Unity Catalog for external storage, and when the required UC privileges are missing it falls back to an insufficient-privileges check under the hood.
 
The key missing permission is likely CREATE TABLE on the schema where you are trying to create the table, along with the correct permissions on the external location. More specifically, the service principal needs:

  • CREATE TABLE on the target schema.
  • USE CATALOG on the catalog and USE SCHEMA on the schema (the Unity Catalog equivalents of the legacy USAGE privilege).
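If any of those are missing, they can be granted with SQL along these lines (catalog, schema, and principal names below are placeholders to adapt):

```sql
-- Placeholder names; substitute your catalog, schema, and service principal.
GRANT USE CATALOG  ON CATALOG my_catalog           TO `my-service-principal`;
GRANT USE SCHEMA   ON SCHEMA  my_catalog.my_schema TO `my-service-principal`;
GRANT CREATE TABLE ON SCHEMA  my_catalog.my_schema TO `my-service-principal`;
```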
 
 


saurabh18cs
Honored Contributor II

Hi @rachelh 

As I understand it, you need to look at the Azure access connector setup for your Unity Catalog metastore, because serverless clusters access storage through an Azure Databricks-managed identity, not the service principal itself.

  • Access Connector (Azure managed identity): used by Databricks serverless compute to access storage on behalf of Unity Catalog.
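To check which identity actually backs the storage access, you can inspect the external location and its grants (the location name below is a placeholder):

```sql
-- Shows the storage credential (access connector identity) behind the location.
DESCRIBE EXTERNAL LOCATION my_external_location;

-- Confirms which privileges the service principal holds on the location.
SHOW GRANTS ON EXTERNAL LOCATION my_external_location;
```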

 


 
