Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Unable to use dbutils in Premium

Jain
New Contributor III

I am unable to use dbutils commands (mkdir, etc.) after upgrading my Databricks workspace from the Standard tier to the Premium tier.
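For example, even a simple call like the one below fails (the mount path is only an illustration; any dbutils filesystem call behaves the same way):

# Illustrative only -- the mount point is a placeholder
dbutils.fs.mkdirs("/mnt/example/raw")
dbutils.fs.ls("/mnt/example")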

It throws the following error:

py4j.security.Py4JSecurityException: Constructor public com.databricks.backend.daemon.dbutils.FSUtilsParallel(org.apache.spark.SparkContext) is not whitelisted

Any suggestions, please?

Please let me know if you need more information.

4 REPLIES

daniel_sahal
Esteemed Contributor

@Abhishek Jain

It looks like someone had the same issue before.

https://community.databricks.com/s/question/0D53f00001OFuWLCA1/can-you-help-with-this-error-please-i...

Jain
New Contributor III

Hi @Daniel Sahal, thank you for the prompt response.

I did have a look at the above.

However, applying the suggestion from the top answer there throws the following error:

AnalysisException: Cannot modify the value of a static config: spark.databricks.pyspark.enablePy4JSecurity
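For reference, this is roughly what I ran in the notebook:

# Sketch of the attempted fix; it fails because this is a static config,
# which can only be set in the cluster's Spark configuration at startup,
# not changed at runtime from a notebook.
spark.conf.set("spark.databricks.pyspark.enablePy4JSecurity", "false")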


Anonymous
Not applicable

Hi @Abhishek Jain,

Hope all is well! Just wanted to check in: were you able to resolve your issue? If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help.

We'd love to hear from you.

Thanks!

Jain
New Contributor III

Hi @Vidula Khanna,

I observed that the Access Mode options available in Databricks by default come with different privileges.

When I use Shared access mode, the mounted external storage is not available, so dbutils, mkdir, and similar commands do not work. However, table-level access control (RBAC) does work in this mode.

When I use No Isolation Shared access mode, the mounted external storage is available and I can use the cluster as required. However, table access control (RBAC) does not work in this mode.

To have both available, I guess I will have to create a custom cluster policy.

For now, I have created two clusters with different access modes as a workaround (a rough sketch of what this boils down to is below).
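As a rough sketch, the workaround amounts to two cluster definitions that differ only in access mode. The snippet below expresses this via the Clusters REST API; the workspace URL, token, runtime, and node type are placeholders, and the field names reflect my reading of the API docs, so please verify them before using.

import requests

HOST = "https://<workspace-url>"        # placeholder workspace URL
TOKEN = "<personal-access-token>"       # placeholder personal access token

# Shared settings for both clusters (example runtime and node type; adjust as needed)
common = {
    "spark_version": "11.3.x-scala2.12",
    "node_type_id": "Standard_DS3_v2",
    "num_workers": 2,
}

clusters = [
    # Shared access mode: table ACLs are enforced, but mounts/dbutils.fs are restricted
    {"cluster_name": "shared-table-acl", "data_security_mode": "USER_ISOLATION", **common},
    # No Isolation Shared: mounts/dbutils.fs work, but table ACLs are not enforced
    {"cluster_name": "no-isolation-mounts", "data_security_mode": "NONE", **common},
]

for payload in clusters:
    resp = requests.post(
        f"{HOST}/api/2.0/clusters/create",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json=payload,
    )
    resp.raise_for_status()
    print(payload["cluster_name"], "->", resp.json().get("cluster_id"))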

I couldn't find any official Databricks documentation that covers this case, so I had to figure it out by trial and error.

Let me know if I might have missed anything.
