04-26-2024 08:51 AM
04-27-2024 09:00 AM
Hello Mounika, many thanks for your question. Are you using a shared access cluster? If yes, shared clusters require you to grant the SELECT permission on ANY FILE in order to access DBFS, as mentioned in this doc: https://docs.databricks.com/en/data-governance/table-acls/any-file.html#how-does-any-file-interact-w...
You can grant this permission by following the steps in https://kb.databricks.com/en_US/data/user-does-not-have-permission-select-on-any-file
Another solution would be to use a single user cluster, which does not require it.
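For example, even a simple read of a raw file on DBFS like the sketch below (the path is only an illustration) will keep failing with the permission error on a shared access cluster until that grant is in place:

# Hypothetical path; any direct read of a file on DBFS behaves the same way on a shared cluster.
df = spark.read.csv("dbfs:/FileStore/tables/example.csv", header=True)
display(df)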
04-30-2024 11:55 PM
Hi @MOUNIKASIMHADRI ,
Workspace admins get ANY FILE granted by default. They can explicitly grant it to non-admin users.
Hence, as suggested in the KB article, run:
GRANT SELECT ON ANY FILE TO `<user@domain-name>`
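If you prefer to run this from a notebook instead of the SQL editor, a minimal sketch would look like the following (run as a workspace admin; the e-mail is a placeholder):

# Run as a workspace admin; replace the placeholder e-mail with the actual user or group.
spark.sql("GRANT SELECT ON ANY FILE TO `user@domain-name`")
# Optional sanity check that the grant is in place (syntax per the table-ACL docs linked above).
display(spark.sql("SHOW GRANTS ON ANY FILE"))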
12-03-2024 06:52 AM
Hi, I am having the same issue. The Databricks extension is properly installed and configured, and my user has sufficient permissions, as I had been working without issues the whole time. But now, when I run my notebooks to read tables in the same Databricks catalog, this error comes up every time, along with 'No module named dbruntime'.
I have tried reinstalling the extension and the environment, and changing the profile and some settings, but it still doesn't work.
12-04-2024 01:31 AM
Hi @mpalacio ,
For "No module named 'dbruntime'":
Are you using dbutils in any code that runs on a Spark worker node?
If yes, that is what throws the above error. This is expected behaviour: dbutils cannot be used in code that runs on a Spark worker node, since there are no user credentials available there.
All UDFs and Spark transforms (filter, map, etc.) run on the worker nodes.
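As a rough sketch (the widget and column names are just examples): calling dbutils inside a UDF fails, while resolving the value on the driver first and passing a plain Python value into the UDF works:

from pyspark.sql import functions as F

# Anti-pattern (commented out): dbutils only exists on the driver, so calling it
# inside a UDF, which executes on the workers, raises errors like "No module named 'dbruntime'".
# @F.udf("string")
# def bad_udf(value):
#     return dbutils.widgets.get("prefix") + value  # runs on workers -> fails

# Works: read the value once on the driver and let Spark ship the plain string.
prefix = "demo_"  # e.g. the result of dbutils.widgets.get("prefix") on the driver

@F.udf("string")
def add_prefix(value):
    return prefix + value  # only a plain Python string is captured, no dbutils here

df = spark.createDataFrame([("a",), ("b",)], ["value"])
display(df.withColumn("prefixed", add_prefix("value")))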
12-04-2024 01:31 AM
Please also refer to other community posts about the "No module named" error, e.g. https://community.databricks.com/t5/data-engineering/udf-importing-from-other-modules/td-p/58988