Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Permission error loading DataFrame from Azure Unity Catalog to GCS bucket

kiko_roy
Contributor

I am creating a DataFrame by reading a table's data residing in an Azure-backed Unity Catalog. I need to write the DataFrame (or a file) to a GCS bucket. I have configured the Spark cluster config using the GCP service account JSON values.

On running:

df1.write.format("parquet").save("gs://dev-XXXX-analyt-XXXXXXXX")

I get the error: "Insufficient privileges: User does not have permission SELECT on any file." What could be the reason or resolution? Need help.
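
For reference, a minimal sketch of the setup described above, assuming the GCS service-account keys are already set in the cluster's Spark config; the table name, bucket, and path below are placeholders, not values from the original post:

# Cluster Spark config (values copied from the service account JSON key file), e.g.:
#   spark.hadoop.google.cloud.auth.service.account.enable true
#   spark.hadoop.fs.gs.project.id <project-id>
#   spark.hadoop.fs.gs.auth.service.account.email <client-email>
#   spark.hadoop.fs.gs.auth.service.account.private.key.id <private-key-id>
#   spark.hadoop.fs.gs.auth.service.account.private.key <private-key>

# Read the Unity Catalog table into a DataFrame, then write it to GCS as Parquet.
df1 = spark.read.table("<catalog>.<schema>.<table>")  # placeholder three-level name
df1.write.format("parquet").mode("overwrite").save("gs://<bucket>/<path>/")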

1 ACCEPTED SOLUTION


Kaniz_Fatma
Community Manager

Hi @kiko_roy, the error message you're encountering, "Insufficient privileges: User does not have permission SELECT on any file," indicates that your user account lacks the necessary permissions to read files.

 

Let's address this issue:

 

Cause:

  • Table access control is enabled on your Databricks cluster, and you are not an admin.
  • The Databricks SQL query analyzer enforces access control policies at runtime. When table access control is enabled, users must have specific permissions to access tables.

Solution:

  • Admins can bypass table access control, but regular users need explicit permissions.
  • An admin must grant SELECT permission on files so that you can create a table.
  • Run the following command in a notebook as an admin: %sql GRANT SELECT ON ANY FILE TO <user@domain-name>, replacing <user@domain-name> with your actual user identifier (a runnable sketch follows this list).
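
As an illustration, a notebook cell run by an admin might look like the sketch below; spark.sql is used in place of the %sql magic, and the principal is a placeholder for the affected user's email (or a group name):

# Run this as a workspace admin; the backticks around the principal are part of
# the Databricks SQL syntax for user names containing special characters.
spark.sql("GRANT SELECT ON ANY FILE TO `user@domain-name`")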

Warning:

  • Users granted access to ANY FILE can bypass restrictions on the catalog, schemas, tables, and views by reading directly from the filesystem.
  • Review the Data object privileges documentation for more information.

Remember to execute the command as an admin, and ensure that the necessary permissions are granted. This should resolve the issue you're facing.

 

If you encounter any further difficulties, feel free to ask for additional assistance! 🚀


5 REPLIES


Thanks @Kaniz_Fatma. The solution did work!

I want to express my gratitude for your effort in selecting the most suitable solution. It's great to hear that your query has been successfully resolved. Thank you for your contribution.




 

ruloweb
New Contributor II

Hi, is there any Terraform resource to apply this GRANT, or does this always have to be done manually?

Hi @ruloweb, apart from granting access manually on ANY FILE, one approach that resolved the issue for me was to use a cluster in single-user access mode.
