4 weeks ago
Just wondering if anyone could help me understand why we are hitting this error: `[INSUFFICIENT_PERMISSIONS] Insufficient privileges: User does not have permission MODIFY on any file`
A job is trying to create a table with an external location (already set up) using serverless compute. The job was able to do this fine on a standard cluster, but once we switched to serverless, we encountered this error. The job is also being "run as" a service principal that has "CREATE EXTERNAL TABLE", "READ FILES", and "WRITE FILES" on the external location. It appears the serverless compute is missing these permissions and falling back to table access control, hence looking for "any file" permissions; can someone help me confirm this? I would like to know why the Unity Catalog permissions are not being honored in this case.
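For reference, the external-location grants described above can be issued from a notebook with `spark.sql`. Here is a minimal sketch of the statements involved; the location and service-principal names are placeholders, not real values from this setup:

```python
# Hedged sketch: build the GRANT statements for the privileges mentioned
# in the post. "my_external_location" and "sp-application-id" are
# hypothetical placeholders.
location = "my_external_location"
principal = "`sp-application-id`"

# Privileges the service principal is said to hold on the external location
privileges = ["CREATE EXTERNAL TABLE", "READ FILES", "WRITE FILES"]

grants = [
    f"GRANT {p} ON EXTERNAL LOCATION {location} TO {principal}"
    for p in privileges
]

for stmt in grants:
    print(stmt)
    # In a Databricks notebook you would execute: spark.sql(stmt)
```

In a real workspace you would run each statement via `spark.sql(...)` (or in the SQL editor) as a principal with the MANAGE or owner privilege on the external location.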
4 weeks ago
Hello @rachelh
Good day!!
4 weeks ago
Hmmm, I do have those permissions assigned to the schema already and appropriate permissions on the external location as stated in the original post.
4 weeks ago - last edited 4 weeks ago
@rachelh these are nice articles on external locations and Unity Catalog: https://docs.databricks.com/aws/en/connect/unity-catalog/cloud-storage/#how-does-unity-catalog-gover... and https://docs.databricks.com/aws/en/connect/unity-catalog/cloud-storage/#external-locations
All the best,
BS
4 weeks ago
Hi @rachelh
As I understand it, you need to look at the Azure access connector setup for your Unity Catalog, because serverless clusters run under an Azure Databricks-managed identity, not the service principal.
4 weeks ago
Sorry, I forgot to mention that we are using AWS as our cloud provider and grant access to S3 using external locations and storage credentials, so your solution may not be relevant to us.