After upgrading the Azure Databricks cluster from 8.3 (includes Apache Spark 3.1.1, Scala 2.12) to 9.1 LTS (includes Apache Spark 3.1.2, Scala 2.12), I am getting an intermittent error.

Darshana_Ganesh
New Contributor II

The error is below and it is intermittent: the same code throws the issue on run 3, doesn't throw it on run 4, and then throws it again on run 5.

An error occurred while calling o1509.getCause. Trace:
py4j.security.Py4JSecurityException: Method public synchronized java.lang.Throwable org.datanucleus.exceptions.NucleusException.getCause() is not whitelisted on class class org.datanucleus.exceptions.NucleusDataStoreException
	at py4j.security.WhitelistingPy4JSecurityManager.checkCall(WhitelistingPy4JSecurityManager.java:473)
	at py4j.Gateway.invoke(Gateway.java:294)
	at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
	at py4j.commands.CallCommand.execute(CallCommand.java:79)
	at py4j.GatewayConnection.run(GatewayConnection.java:251)
	at java.lang.Thread.run(Thread.java:748)

ErrorMessage written to ADLS Successfully..

org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: Unable to alter table.
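
For context, a Py4JSecurityException like the one above is raised when Python code (or a library on its behalf) calls a Java method such as getCause() through the Py4J gateway and the cluster's whitelisting security manager rejects it. Below is a minimal, purely illustrative sketch of that pattern, assuming a Databricks notebook where spark is the pre-created SparkSession; the table name, the ALTER TABLE statement, and the decision to inspect the cause chain are assumptions, not the original notebook code.

from py4j.protocol import Py4JJavaError

try:
    # Hypothetical call standing in for whatever operation fails with
    # "Unable to alter table" in the original notebook.
    spark.sql("ALTER TABLE some_db.some_table SET TBLPROPERTIES ('k' = 'v')")
except Py4JJavaError as e:
    # e.java_exception is only a proxy for the Java throwable; each method
    # called on it (here getCause()) travels back through the Py4J gateway,
    # where a Table ACL cluster's whitelisting security manager can reject
    # the call and raise a Py4JSecurityException like the one in the trace.
    print(e.java_exception.getCause())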

ACCEPTED SOLUTION

User16764241763
Honored Contributor

Hello @Darshana Ganesh

Are you setting the "spark.databricks.pyspark.enablePy4JSecurity": "true" config on a Table ACL-enabled cluster?
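
For reference, a quick way to see whether that flag is explicitly set is to read it from the running session. This is only a minimal sketch, assuming a Databricks notebook where spark is the pre-created SparkSession; the "not set" fallback string is just a placeholder.

# Check whether the Py4J security flag is set on this cluster.
current = spark.conf.get("spark.databricks.pyspark.enablePy4JSecurity", "not set")
print("spark.databricks.pyspark.enablePy4JSecurity =", current)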


5 REPLIES

Darshana_Ganesh
New Contributor II

On executing the below code in an Azure Databricks notebook:

print(spark.version)

for 8.3 (includes Apache Spark 3.1.1, Scala 2.12), I am getting the output 3.1.0,

and for 9.1 LTS (includes Apache Spark 3.1.2, Scala 2.12), I am getting 3.1.2.

Hi @Darshana Ganesh, can you please share the code stack here?

Anonymous
Not applicable

Hello @Darshana Ganesh, my name is Piper and I'm a moderator for Databricks. Thank you for your question and the extra information. We'll give the community some time to answer before we circle back if we need to.

User16764241763
Honored Contributor

Hello @Darshana Ganesh

Are you setting the "spark.databricks.pyspark.enablePy4JSecurity": "true" config on a Table ACL-enabled cluster?

Anonymous
Not applicable

Hey @Darshana Ganesh

Just wanted to check in to see whether you were able to resolve your issue or if you need more help. We'd love to hear from you.

Thanks!
