Sunday
I am attempting to upgrade my application from Databricks runtime version 12.2 LTS to 15.5 LTS. During this upgrade, my Spark job fails with the following error:
java.lang.NoSuchMethodError: org.apache.spark.scheduler.SparkListenerApplicationEnd.<init>(J)V
Spark Version in Databricks 15.5 LTS: The runtime includes Apache Spark 3.5.x, which defines the SparkListenerApplicationEnd constructor as:
public SparkListenerApplicationEnd(long time)
This constructor takes a single long parameter.
Conflicting Spark Library in Databricks: The error arises due to a conflicting library: ----ws_3_5--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar. This library includes a different version of the SparkListenerApplicationEnd class, which defines the constructor as:
public SparkListenerApplicationEnd(long time, scala.Option<Object> exitCode)
This constructor is present in Spark 4.0.0-preview2, not in Spark 3.5.x.
Impact: At runtime, the JVM attempts to use the single-parameter constructor (<init>(J)V) but fails because the conflicting library expects the two-parameter version. This mismatch leads to the NoSuchMethodError.
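To confirm which variant of the class a JVM actually resolves, you can list its public constructor signatures via reflection. This is a minimal sketch; it uses java.lang.Long as a stand-in so it runs anywhere, but on the cluster you would pass "org.apache.spark.scheduler.SparkListenerApplicationEnd" instead.

```scala
object ConstructorInspector {
  // Return every public constructor signature the JVM resolves for a class.
  def constructorSignatures(className: String): Seq[String] =
    Class.forName(className)
      .getConstructors
      .map(c => c.getParameterTypes.map(_.getName).mkString("(", ", ", ")"))
      .toSeq

  def main(args: Array[String]): Unit =
    // Illustrative stand-in; on a Databricks cluster, inspect
    // "org.apache.spark.scheduler.SparkListenerApplicationEnd" instead.
    constructorSignatures("java.lang.Long").foreach(println)
}
```

Running this in a notebook against the Spark class shows immediately whether the single-parameter or two-parameter constructor is on the classpath.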
Thank you in advance for your support!
yesterday
The error you are encountering occurs because the runtime includes Apache Spark 3.5.x, which defines the constructor as public SparkListenerApplicationEnd(long time), while a conflicting library (----ws_3_5--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar) contains a different version of the class whose constructor is public SparkListenerApplicationEnd(long time, scala.Option<Object> exitCode).
At runtime, the JVM attempts to resolve the single-parameter constructor but fails because the class actually loaded defines the two-parameter version, leading to the NoSuchMethodError.
To resolve this issue, you can try the following steps:
Identify and Remove Conflicting Libraries: Check your dependencies and remove or update the conflicting library that includes the SparkListenerApplicationEnd class with the two-parameter constructor. Ensure that all libraries are compatible with Apache Spark 3.5.x.
Update Dependencies: Ensure that all your project dependencies are updated to versions compatible with Databricks runtime 15.5 LTS and Apache Spark 3.5.x.
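When hunting for the conflicting library, it can help to ask the JVM which jar a class was actually loaded from. The sketch below uses the class's code source for that; note that JDK bootstrap classes (used here so the example is self-contained) report no code source, whereas an application or runtime jar will report its location.

```scala
object ClassOrigin {
  // Report the jar (code source) a class was loaded from, or None for
  // bootstrap-classpath classes such as the JDK's own.
  def codeSource(className: String): Option[String] =
    Option(Class.forName(className).getProtectionDomain.getCodeSource)
      .map(_.getLocation.toString)

  def main(args: Array[String]): Unit =
    // On a cluster, query "org.apache.spark.scheduler.SparkListenerApplicationEnd"
    // to see which deploy jar actually supplied the class.
    println(codeSource("java.lang.String").getOrElse("bootstrap classpath"))
}
```

If the reported location is the ----ws_3_5--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar mentioned above, that confirms the runtime jar, not your application dependency, is supplying the incompatible class.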
yesterday
The issue is that Databricks 15.4 LTS includes the ws_3_5--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar library, which is not compatible with Spark 3.5.x. Spark 3.5.x defines only the single-argument SparkListenerApplicationEnd constructor.
Since Databricks 15.4 LTS includes Spark 3.5.x, the ws_3_5--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar library should be compatible with Spark 3.5.x, but it is not.
8 hours ago
I am trying to initialize the class org.apache.spark.scheduler.SparkListenerApplicationEnd with Databricks 15.4 LTS.
Spark 3.5.0 expects a single argument constructor for org.apache.spark.scheduler.SparkListenerApplicationEnd(long time)
Whereas the class packaged in Databricks jar "----ws_3_5--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar" expects a 2 argument constructor i.e.
org.apache.spark.scheduler.SparkListenerApplicationEnd(long time, scala.Option<Object> exitCode)
This 2-argument constructor is in line with Spark 4.0.0-preview2 and is NOT in Spark 3.5.0.
This is causing a conflict; could you please check this version issue in the Databricks cluster binaries?
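If the event is being constructed from your own application code (rather than inside Spark itself), one defensive workaround, sketched below and not a Databricks-endorsed fix, is to select whichever constructor variant exists at runtime via reflection. The example uses java.util.Date (which has a (long) constructor) so it runs without Spark; on the cluster you would pass the SparkListenerApplicationEnd class name.

```scala
object ListenerEndFactory {
  // Instantiate a class via whichever long-first constructor exists:
  // (long) on Spark 3.5.x, or (long, Option) on the preview-style class.
  def newApplicationEnd(className: String, time: Long): AnyRef = {
    val cls = Class.forName(className)
    // Keep only public constructors whose first parameter is long/Long.
    val longFirst = cls.getConstructors.filter { c =>
      val ps = c.getParameterTypes
      ps.nonEmpty && (ps(0) == java.lang.Long.TYPE || ps(0) == classOf[java.lang.Long])
    }
    longFirst.find(_.getParameterCount == 1)
      .map(_.newInstance(java.lang.Long.valueOf(time)).asInstanceOf[AnyRef])
      .orElse(longFirst.find(_.getParameterCount == 2)
        // Pass None for the exitCode parameter of the two-argument variant.
        .map(_.newInstance(java.lang.Long.valueOf(time), None).asInstanceOf[AnyRef]))
      .getOrElse(throw new NoSuchMethodException(
        "No (long) or (long, Option) constructor on " + className))
  }
}
```

This sidesteps the compile-time binding that triggers the NoSuchMethodError, but it only helps code you control; listener events constructed inside Spark or Databricks internals are unaffected.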
4 hours ago
I can attest to this being the case as well. I ran into this issue trying to implement an updated form of the
4 hours ago
@DBonomo , did you find any workaround for this?
2 hours ago
No, I am currently downgrading to an older DBR (13.3) and running these jobs specifically on that version. That brings its own suite of problems though.