05-18-2023 03:49 AM
I am unable to use dbutils commands, and commands like mkdir also do not work, after upgrading my Databricks workspace from the Standard tier to the Premium tier.
It throws the following error:
py4j.security.Py4JSecurityException: Constructor public com.databricks.backend.daemon.dbutils.FSUtilsParallel(org.apache.spark.SparkContext) is not whitelisted
Any suggestions, please?
Let me know if you need more information.
05-18-2023 06:32 AM
@Abhishek Jain
It looks like someone had the same issue before.
06-21-2023 12:05 AM
Hi @Abhishek Jain,
Hope all is well! Just wanted to check in: were you able to resolve your issue? If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help.
We'd love to hear from you.
Thanks!
06-21-2023 12:57 AM
Hi @Vidula Khanna,
I observed that the access mode options available in Databricks come with different privileges by default.
With Shared access mode, the mounted external storage is not available, so dbutils, mkdir, and similar commands do not work. However, table-level access control (RBAC) does work in this mode.
With No Isolation Shared access mode, the mounted external storage is available and I can use the cluster as required. However, table access RBAC does not work in this mode.
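The trade-off described above can be sketched as a small capability table. This is just a hypothetical summary of what this thread observed, not official Databricks behavior; the mode names and capability flags are assumptions based on the posts above.

```python
from typing import Optional

# Capability matrix based on the observations in this thread (assumed,
# not from official Databricks documentation):
# - Shared access mode: table ACLs enforced, but mounted external
#   storage (and hence dbutils.fs on mounts) is blocked.
# - No Isolation Shared: mounts work, but table ACLs are not enforced.
ACCESS_MODE_CAPABILITIES = {
    "shared": {"table_acls": True, "mounts": False},
    "no_isolation_shared": {"table_acls": False, "mounts": True},
}

def pick_access_mode(need_table_acls: bool, need_mounts: bool) -> Optional[str]:
    """Return the first access mode satisfying both requirements,
    or None if no single mode does (the situation in this thread)."""
    for mode, caps in ACCESS_MODE_CAPABILITIES.items():
        if (not need_table_acls or caps["table_acls"]) and \
           (not need_mounts or caps["mounts"]):
            return mode
    return None
```

Under these assumptions, needing both table ACLs and mounts returns no single mode, which is why two separate clusters (one per mode) ended up being the workaround.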
To have both available, I suppose I will have to create a custom policy.
For now, I have created two clusters with different access modes as a workaround.
I couldn't find any official Databricks documentation that covers this case, so I had to figure it out by trial and error.
Let me know if I might have missed anything.