Unable to use dbutils in Premium
05-18-2023 03:49 AM
I am unable to use dbutils commands after upgrading my Databricks workspace from the Standard tier to the Premium tier; mkdir and similar filesystem operations also fail.
Every call throws the following error:
py4j.security.Py4JSecurityException: Constructor public com.databricks.backend.daemon.dbutils.FSUtilsParallel(org.apache.spark.SparkContext) is not whitelisted
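For reference, a minimal repro (the mount path below is hypothetical; any dbutils.fs call fails the same way on the affected cluster):

```python
# Minimal repro -- /mnt/demo is a hypothetical mount point; any
# dbutils.fs call fails the same way on the affected cluster.
dbutils.fs.mkdirs("/mnt/demo/new_dir")  # -> py4j.security.Py4JSecurityException
dbutils.fs.ls("/mnt/demo")              # same exception
```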
Any suggestions, please?
Let me know if you need more information.
- Labels:
  - Databricks Premium
  - Dbutils
  - Standard
05-18-2023 06:32 AM
@Abhishek Jain
It looks like someone had the same issue before.
05-18-2023 06:46 AM
Hi @Daniel Sahal, thank you for the prompt response.
I did have a look at that thread; however, the most-upvoted answer there throws this error:
AnalysisException: Cannot modify the value of a static config: spark.databricks.pyspark.enablePy4JSecurity
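To be concrete, this is a sketch of what that answer suggests (the config name is taken straight from the error). It fails because static Spark configs are fixed at cluster startup and cannot be changed from a notebook:

```python
# What the linked answer suggests -- fails at runtime, because this
# config is static (fixed when the cluster starts):
spark.conf.set("spark.databricks.pyspark.enablePy4JSecurity", "false")
# -> AnalysisException: Cannot modify the value of a static config:
#    spark.databricks.pyspark.enablePy4JSecurity
#
# A static config could only be applied via the cluster's Spark config
# (Compute > cluster > Advanced options > Spark) before startup.
```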
06-21-2023 12:05 AM
Hi @Abhishek Jain,
Hope all is well! Just wanted to check in: were you able to resolve your issue? If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help.
We'd love to hear from you.
Thanks!
06-21-2023 12:57 AM
Hi @Vidula Khanna,
I observed that the access modes Databricks offers by default come with different privileges.
With Shared access mode, mounted external storage is not available, so dbutils, mkdir, and similar commands do not work; however, table-level access control (RBAC) is enforced.
With No Isolation Shared access mode, mounted external storage is available and I can use the cluster as needed; however, table access control is not enforced.
To get both at once, I suspect I would need a custom cluster policy.
For now, as a workaround, I have created two clusters with different access modes (see the sketch below).
I couldn't find any official Databricks documentation covering this case, so I had to figure it out by trial and error.
Let me know if I have missed anything.
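For anyone who finds this later, here is a rough sketch of the two-cluster workaround via the Clusters REST API. It is only illustrative: the runtime version and node type are placeholders, and the data_security_mode values should be verified against the Clusters API docs for your workspace.

```python
# Sketch: creating the two workaround clusters via the Clusters REST API.
# Assumptions: DATABRICKS_HOST / DATABRICKS_TOKEN env vars are set, and
# spark_version / node_type_id are placeholders -- adjust for your workspace.
import os
import requests

host = os.environ["DATABRICKS_HOST"]
token = os.environ["DATABRICKS_TOKEN"]

def create_cluster(name: str, security_mode: str) -> str:
    payload = {
        "cluster_name": name,
        "spark_version": "13.3.x-scala2.12",  # placeholder runtime
        "node_type_id": "Standard_DS3_v2",    # placeholder node type
        "num_workers": 2,
        "data_security_mode": security_mode,
    }
    r = requests.post(
        f"{host}/api/2.1/clusters/create",
        headers={"Authorization": f"Bearer {token}"},
        json=payload,
    )
    r.raise_for_status()
    return r.json()["cluster_id"]

# "NONE" = No Isolation Shared: mounts and dbutils.fs work, no table ACLs.
# "USER_ISOLATION" = Shared: table ACLs enforced, mounted storage restricted.
create_cluster("mounts-cluster", "NONE")
create_cluster("acl-cluster", "USER_ISOLATION")
```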

