Administration & Architecture

How does a non-admin user read a public s3 bucket on serverless?

spd_dat
New Contributor II

As an admin, I can easily read a public s3 bucket from serverless:

spark.read.parquet("s3://[public bucket]/[path]").display()

So can a non-admin user, from classic compute.

 
But why does a non-admin user, from serverless (both environments 1 & 2), get the following error:
[INSUFFICIENT_PERMISSIONS] Insufficient privileges: User does not have permission SELECT on any file. SQLSTATE: 42501
(Again, it's a public bucket.)
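
For completeness, this is a minimal sketch of what the non-admin user runs on serverless (the bucket and path are placeholders, and the broad except is only there to print the error and SQLSTATE):

# Minimal repro sketch, run as the non-admin user on serverless compute
# in a notebook, where `spark` is the notebook-provided session.
# The bucket and path below are placeholders, not a real location.
try:
    df = spark.read.parquet("s3://some-public-bucket/some/path/")
    df.display()
except Exception as e:
    # For the non-admin user on serverless this fails with
    # [INSUFFICIENT_PERMISSIONS] ... SQLSTATE: 42501
    print(f"Read failed: {e}")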
1 ACCEPTED SOLUTION

Alberto_Umana
Databricks Employee

Hi @spd_dat,

Is the S3 bucket in the same region as your workspace? It might require using an IAM role / S3 bucket policy to allow access to the bucket even if it is public.

Just as a test, can you try granting the permission below to the user who is hitting this:

GRANT SELECT ON ANY FILE TO `<user@domain-name>`;
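
A rough sketch of doing this from a notebook as a workspace admin; the email is a placeholder, the check at the end uses the legacy table-ACL SHOW GRANT syntax, and depending on your workspace you may need to run this on a SQL warehouse or a table-ACL-enabled cluster:

# Sketch: run as a workspace admin; `spark` is the notebook session,
# and the email below is a placeholder for the affected non-admin user.
user = "user@domain-name"

# Grant the legacy ANY FILE privilege so direct path-based reads are allowed.
spark.sql(f"GRANT SELECT ON ANY FILE TO `{user}`")

# Optionally review what has been granted on ANY FILE (legacy table-ACL syntax).
spark.sql("SHOW GRANT ON ANY FILE").display()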


2 REPLIES

spd_dat
New Contributor II

Thanks Alberto,

Yes, granting solves it -- I was initially worried that it would mean overly broad permissions (as the warning box in the docs states), but I guess it is moderately comforting to read:

Privileges on the ANY FILE securable cannot override Unity Catalog privileges and do not grant or expand privileges on data objects governed by Unity Catalog. Some drivers and custom-installed libraries might compromise user isolation by storing data of all users in one common temp directory.
https://docs.databricks.com/aws/en/data-governance/table-acls/any-file#privileges-for-any-file

In any case, another workaround remains: non-admin users can use classic compute for this.

(It is not in the same region, but I did not worry too much about region since they can already read it via classic compute.)
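
And since my worry was about the grant being too broad: if it ever becomes a problem, I assume it can simply be revoked again (untested sketch, placeholder email):

# Untested sketch: revoke the legacy ANY FILE privilege if it turns out to be too broad.
# Run as an admin in the same context used for the original GRANT; the email is a placeholder.
spark.sql("REVOKE SELECT ON ANY FILE FROM `user@domain-name`")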
