Issue: NoSuchMethodError in Spark Job While Upgrading to Databricks 15.5 LTS
01-05-2025 09:59 PM
Problem Description
I am attempting to upgrade my application from Databricks runtime version 12.2 LTS to 15.5 LTS. During this upgrade, my Spark job fails with the following error:
java.lang.NoSuchMethodError: org.apache.spark.scheduler.SparkListenerApplicationEnd.<init>(J)V
Root Cause Analysis
Spark Version in Databricks 15.5 LTS: The runtime includes Apache Spark 3.5.x, which defines the SparkListenerApplicationEnd constructor as:
public SparkListenerApplicationEnd(long time)
This constructor takes a single long parameter.
Conflicting Spark Library in Databricks: The error arises due to a conflicting library: ----ws_3_5--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar. This library includes a different version of the SparkListenerApplicationEnd class, which defines the constructor as:
public SparkListenerApplicationEnd(long time, scala.Option<Object> exitCode)
This two-parameter constructor is present only in Spark 4.0.0-preview2, not in Spark 3.5.x.
Impact: At runtime, the JVM attempts to use the single-parameter constructor (<init>(J)V) but fails because the conflicting library expects the two-parameter version. This mismatch leads to the NoSuchMethodError.
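One way to confirm the diagnosis is to check, at runtime on the cluster, which jar actually supplied the class and which constructors it exposes. The sketch below is illustrative: it inspects java.util.ArrayList as a stand-in so it runs anywhere, but on a Databricks cluster you would pass org.apache.spark.scheduler.SparkListenerApplicationEnd instead.

```java
import java.lang.reflect.Constructor;
import java.security.CodeSource;

public class ClassOriginCheck {
    // Print the jar (code source) a class was loaded from and list its
    // declared constructors, to spot signature mismatches.
    static void inspect(String className) throws ClassNotFoundException {
        Class<?> cls = Class.forName(className);
        CodeSource src = cls.getProtectionDomain().getCodeSource();
        System.out.println(className + " loaded from: "
                + (src == null ? "(JDK bootstrap/platform classpath)" : src.getLocation()));
        for (Constructor<?> c : cls.getDeclaredConstructors()) {
            System.out.println("  " + c);
        }
    }

    public static void main(String[] args) throws Exception {
        // On a Databricks cluster, replace with:
        // inspect("org.apache.spark.scheduler.SparkListenerApplicationEnd");
        inspect("java.util.ArrayList"); // stand-in so the example runs anywhere
    }
}
```

If the printed location points at the core-hive deploy jar and only the two-argument constructor is listed, that confirms the conflict described above.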
Thank you in advance for your support!
01-06-2025 04:09 AM
The error you are encountering stems from a version conflict: the runtime includes Apache Spark 3.5.x, which defines the constructor as public SparkListenerApplicationEnd(long time), while a conflicting library (----ws_3_5--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar) carries a different version of the class whose constructor is public SparkListenerApplicationEnd(long time, scala.Option<Object> exitCode).
The JVM attempts to resolve the single-parameter constructor but fails because the class loaded from the conflicting library only provides the two-parameter version, which leads to the NoSuchMethodError.
To resolve this issue, you can try the following steps:
- Identify and Remove Conflicting Libraries: Check your dependencies and remove or update the conflicting library that includes the SparkListenerApplicationEnd class with the two-parameter constructor. Ensure that all libraries are compatible with Apache Spark 3.5.x.
- Update Dependencies: Ensure that all your project dependencies are updated to versions compatible with Databricks Runtime 15.5 LTS and Apache Spark 3.5.x.
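Until the binary mismatch is resolved, one possible defensive workaround is to construct the event reflectively: try the Spark 3.5 single-argument constructor first and fall back to the two-argument form if it is missing. This is only a sketch; DemoEvent is a hypothetical stand-in with the two-argument shape described in the thread, so the example compiles and runs outside a Databricks cluster. On a real cluster the second parameter type would be scala.Option (obtained via Class.forName("scala.Option")) and the fallback value scala.Option.empty(), not Object and null.

```java
import java.lang.reflect.Constructor;

public class EventFactory {
    // Hypothetical stand-in for SparkListenerApplicationEnd as packaged in the
    // conflicting jar: only a two-argument constructor is available.
    public static class DemoEvent {
        public final long time;
        public DemoEvent(long time, Object exitCode) { this.time = time; }
    }

    // Try the 1-arg (long) constructor first; fall back to the 2-arg form.
    static Object newApplicationEnd(Class<?> cls, long time) throws Exception {
        try {
            Constructor<?> oneArg = cls.getConstructor(long.class);
            return oneArg.newInstance(time);
        } catch (NoSuchMethodException e) {
            // Spark 4.0.0-preview2 shape; use scala.Option on a real cluster.
            Constructor<?> twoArg = cls.getConstructor(long.class, Object.class);
            return twoArg.newInstance(time, null);
        }
    }

    public static void main(String[] args) throws Exception {
        Object event = newApplicationEnd(DemoEvent.class, System.currentTimeMillis());
        System.out.println("Created: " + event.getClass().getSimpleName());
    }
}
```

This keeps the calling code working against either binary, at the cost of a reflective call on an infrequent code path (application end).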
01-06-2025 04:19 AM
The issue is that Databricks 15.4 LTS includes the ws_3_5--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar library, which is not compatible with Spark 3.5.x: Spark 3.5.x defines only the single-argument SparkListenerApplicationEnd constructor.
Since Databricks 15.4 LTS ships Spark 3.5.x, this bundled library should be compatible with Spark 3.5.x, but it is not.
01-07-2025 04:03 AM
I am trying to initialize the class org.apache.spark.scheduler.SparkListenerApplicationEnd on Databricks 15.4 LTS.
Spark 3.5.0 expects a single-argument constructor: org.apache.spark.scheduler.SparkListenerApplicationEnd(long time).
Whereas the class packaged in the Databricks jar "----ws_3_5--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar" expects a two-argument constructor:
org.apache.spark.scheduler.SparkListenerApplicationEnd(long time, scala.Option<Object> exitCode)
This two-argument constructor matches Spark 4.0.0-preview2 and is NOT in Spark 3.5.0.
This is causing a conflict; can you please check this version issue in the Databricks cluster binaries?
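To verify independently which class files a suspect jar actually contains, you can scan it programmatically (or with jar tf / javap). A minimal sketch follows; it builds a throwaway jar with one empty entry so the example runs anywhere, but on a cluster you would point containsClass at the deploy jar named above.

```java
import java.io.File;
import java.io.FileOutputStream;
import java.util.jar.JarEntry;
import java.util.jar.JarFile;
import java.util.jar.JarOutputStream;

public class JarScan {
    // Return true if the jar contains an entry for the given class name.
    static boolean containsClass(File jar, String binaryName) throws Exception {
        String entryName = binaryName.replace('.', '/') + ".class";
        try (JarFile jf = new JarFile(jar)) {
            return jf.getEntry(entryName) != null;
        }
    }

    public static void main(String[] args) throws Exception {
        // Build a tiny jar with one (empty) class entry so the sketch is runnable.
        File jar = File.createTempFile("demo", ".jar");
        try (JarOutputStream out = new JarOutputStream(new FileOutputStream(jar))) {
            out.putNextEntry(new JarEntry(
                "org/apache/spark/scheduler/SparkListenerApplicationEnd.class"));
            out.closeEntry();
        }
        System.out.println(containsClass(jar,
            "org.apache.spark.scheduler.SparkListenerApplicationEnd")); // prints true
        jar.delete();
    }
}
```

Once you know the jar ships the class, javap -classpath that-jar org.apache.spark.scheduler.SparkListenerApplicationEnd will show which constructor signatures it actually compiled in.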
01-07-2025 07:39 AM
I can attest to this being the case as well. I ran into this issue while trying to implement an updated form of the
01-07-2025 07:49 AM
@DBonomo , did you find any workaround for this?
01-07-2025 09:20 AM
No, I am currently downgrading to an older DBR (13.3) and running these jobs specifically on that version. That brings its own suite of problems, though.

