Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Trying to run Databricks Academy labs, but execution fails because the clearCache method is not whitelisted

Brammer88
New Contributor III

Hi there,

I'm trying to run DE 2.1 - Querying Files Directly in my workspace with the default cluster configuration shown below,

[screenshot: cluster configuration]

but I cannot seem to run this file (or any of the other labs), as it gives me this error message:

 

Resetting the learning environment:
The execution of this command did not finish successfully
 
Python interpreter will be restarted. Python interpreter will be restarted.
 
 
Resetting the learning environment:
 
py4j.security.Py4JSecurityException: Method public void org.apache.spark.sql.internal.CatalogImpl.clearCache() is not whitelisted on class class org.apache.spark.sql.internal.CatalogImpl
 
Any pointers as to how I can fix this? Where do I need to go to whitelist this class, and how would I do that?
 
Thanks,
Bram
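For context, here is a minimal sketch of the call this error points at, assuming the lab's reset/classroom-setup helper clears the Spark SQL cache and using the ambient notebook spark session. spark.catalog.clearCache() is forwarded over Py4J to org.apache.spark.sql.internal.CatalogImpl.clearCache(), the exact method named above, and on clusters that enforce Py4J method whitelisting the gateway rejects it before it runs:

try:
    spark.catalog.clearCache()  # same JVM method named in the exception above
except Exception as e:          # surfaces as py4j.security.Py4JSecurityException
    print(type(e).__name__, e)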

 

 

6 REPLIES

Kaniz_Fatma
Community Manager

Hi @Brammer88

There are a couple of ways to handle this (a sketch of both as cluster Spark config entries follows below):
  • Option 1: Disable Py4J security altogether (not recommended). You can set spark.databricks.pyspark.enablePy4JSecurity to false, but this is not recommended because of its security implications.
  • Option 2: Whitelist the necessary classes and methods. Set the spark.jvm.class.allowlist property in your cluster's Spark configuration and specify the fully qualified class names and methods you want to allow, for example org.apache.spark.sql.internal.CatalogImpl.clearCache.
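Both suggestions are cluster-level Spark configuration entries (Compute > your cluster > Edit > Advanced options > Spark config), one key-value pair per line, and the cluster has to be restarted for them to take effect. A sketch of what each would look like in that box; the allowlist property name is the one given above and may not be honored on every cluster type or runtime version:

Option 1 (not recommended):
spark.databricks.pyspark.enablePy4JSecurity false

Option 2:
spark.jvm.class.allowlist org.apache.spark.sql.internal.CatalogImpl.clearCache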

Hi Kaniz,

I'm not sure why, but it seems like it's not working after all; I was cheering too soon.

I have added the line

spark.jvm.class.allowlist org.apache.spark.sql.internal.CatalogImpl.clearCache

to my cluster's Spark config (under advanced options), but I still get the same response:

"The execution of this command did not finish successfully
 
Python interpreter will be restarted. Python interpreter will be restarted.
 
 
Resetting the learning environment:
py4j.security.Py4JSecurityException: Method public void org.apache.spark.sql.internal.CatalogImpl.clearCache() is not whitelisted on class class org.apache.spark.sql.internal.CatalogImpl
Command skipped
Command skipped" 
 
Do you have any ideas as to what I am doing wrong?
 
Thanks,
Bram
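A quick sanity check at this point (a sketch; run in a notebook cell attached to the same cluster, using the ambient spark session) is to read the properties back and confirm the cluster-level entries actually reached the session:

print(spark.conf.get("spark.databricks.pyspark.enablePy4JSecurity", "<not set>"))
print(spark.conf.get("spark.jvm.class.allowlist", "<not set>"))

If either comes back "<not set>", the Spark config edit has not taken effect yet (for example, if the cluster was not restarted after editing).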

 

Brammer88
New Contributor III

Hi @Kaniz_Fatma,

Do you have any other solution or idea as to what might be causing the issue? Or do you need any more details for the investigation? Thanks again for your help! I am completely stuck, as I am unable to do any work on the labs this way.

Best,

Bram

 

Brammer88
New Contributor III

Works, thanks for the quick response!

Brammer88
New Contributor III

Hi @Kaniz,

Sorry, I was too fast with that reply. I'm not sure why, but it seems like it's not working after all; I was cheering too soon.

I have added the line

spark.jvm.class.allowlist org.apache.spark.sql.internal.CatalogImpl.clearCache

to my cluster's Spark config (under advanced options), but I still get the same response:

"The execution of this command did not finish successfully
 
Python interpreter will be restarted. Python interpreter will be restarted.
 
 
Resetting the learning environment:
py4j.security.Py4JSecurityException: Method public void org.apache.spark.sql.internal.CatalogImpl.clearCache() is not whitelisted on class class org.apache.spark.sql.internal.CatalogImpl
Command skipped
Command skipped" 
 

Do you have any other solution or idea as to what might be causing the issue? Or do you need any more details for the investigation? Thanks again for your help! I am completely stuck, as I am unable to do any work on the labs this way.

Best,

Bram

Brammer88
New Contributor III

Hi @Kaniz_Fatma and Databricks team,

Have you already found another solution for this?

Thanks,

Bram
