Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

After upgrading the Azure Databricks cluster from 8.3 (includes Apache Spark 3.1.1, Scala 2.12) to 9.1 LTS (includes Apache Spark 3.1.2, Scala 2.12), I am getting an intermittent error.

Darshana_Ganesh
New Contributor II

The error is as below and is intermittent. For example, the same code throws the issue on run 3 but not on run 4, and then throws it again on run 5.

An error occurred while calling o1509.getCause. Trace:

py4j.security.Py4JSecurityException: Method public synchronized java.lang.Throwable org.datanucleus.exceptions.NucleusException.getCause() is not whitelisted on class class org.datanucleus.exceptions.NucleusDataStoreException
    at py4j.security.WhitelistingPy4JSecurityManager.checkCall(WhitelistingPy4JSecurityManager.java:473)
    at py4j.Gateway.invoke(Gateway.java:294)
    at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
    at py4j.commands.CallCommand.execute(CallCommand.java:79)
    at py4j.GatewayConnection.run(GatewayConnection.java:251)
    at java.lang.Thread.run(Thread.java:748)

ErrorMessage written to ADLS Successfully..

org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: Unable to alter table.
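
For context, a rough sketch of where such a gateway call can originate is below. The table name and the explicit exception handling are made up for illustration and are not taken from the failing job; in practice the getCause() call may also happen inside PySpark's own error-handling code rather than in user code.

# Hypothetical illustration only. 'spark' is the SparkSession that Databricks notebooks provide.
try:
    # A metastore operation that fails on the JVM side (e.g. an ALTER TABLE the Hive
    # metastore rejects) surfaces a Java exception wrapped by DataNucleus.
    # The database and table names below are made up.
    spark.sql("ALTER TABLE some_db.some_table SET TBLPROPERTIES ('k' = 'v')")
except Exception as e:
    java_exc = getattr(e, "java_exception", None)  # present on py4j Py4JJavaError objects
    if java_exc is not None:
        # Calling getCause() on the Java exception object is itself a Py4J gateway call.
        # On a Table ACL enabled cluster the security manager checks it against a method
        # whitelist and raises py4j.security.Py4JSecurityException for unlisted methods.
        print(java_exc.getCause())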

1 ACCEPTED SOLUTION

Accepted Solutions

User16764241763
Honored Contributor

Hello @Darshana Ganesh

Are you setting the "spark.databricks.pyspark.enablePy4JSecurity": "true" config on Table ACL enabled clusters?
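
If it helps to verify, a quick check from a notebook might look like the sketch below. This assumes the key is readable through spark.conf on the cluster, which may not hold everywhere, so treat it as a rough probe rather than a guaranteed API.

# Assumption: the flag is exposed via spark.conf; if it was never set explicitly,
# the supplied default "not set" is returned instead of raising an error.
value = spark.conf.get("spark.databricks.pyspark.enablePy4JSecurity", "not set")
print("spark.databricks.pyspark.enablePy4JSecurity =", value)

# If the flag is set manually, it normally lives in the cluster's Spark config
# (cluster > Advanced options > Spark) as a line of the form:
#   spark.databricks.pyspark.enablePy4JSecurity <true|false>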


4 REPLIES

Darshana_Ganesh
New Contributor II

On executing the below code in an Azure Databricks notebook:

print(spark.version)

For 8.3 (includes Apache Spark 3.1.1, Scala 2.12), I am getting the output 3.1.0,

and for 9.1 LTS (includes Apache Spark 3.1.2, Scala 2.12), I am getting 3.1.2.
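
As a side note, print(spark.version) reports the bundled Apache Spark build, not the Databricks Runtime label. A sketch for reading both is below; it assumes the spark.databricks.clusterUsageTags.sparkVersion tag is readable on the cluster, so treat that key as an assumption rather than a documented contract.

# Apache Spark version bundled with the runtime (e.g. 3.1.2).
print(spark.version)

# Assumed cluster-usage tag holding the Databricks Runtime string (e.g. "9.1.x-scala2.12");
# the default "unknown" is returned if the key is not present on this cluster.
print(spark.conf.get("spark.databricks.clusterUsageTags.sparkVersion", "unknown"))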

Anonymous
Not applicable

Hello @Darshana Ganesh - My name is Piper, and I'm a moderator for Databricks. Thank you for your question and the extra information. We'll give the community some time to answer before we circle back if we need to.

User16764241763
Honored Contributor

Hello @Darshana Ganesh

Are you setting the "spark.databricks.pyspark.enablePy4JSecurity": "true" config on Table ACL enabled clusters?

Anonymous
Not applicable

Hey @Darshana Ganesh

Just wanted to check in to see whether you were able to resolve your issue or whether you need more help. We'd love to hear from you.

Thanks!
