Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Issue: NoSuchMethodError in Spark Job While Upgrading to Databricks 15.5 LTS

sahil_s_jain
New Contributor II

Problem Description

I am attempting to upgrade my application from Databricks runtime version 12.2 LTS to 15.5 LTS. During this upgrade, my Spark job fails with the following error:

java.lang.NoSuchMethodError: org.apache.spark.scheduler.SparkListenerApplicationEnd.<init>(J)V

Root Cause Analysis

  • Spark Version in Databricks 15.5 LTS: The runtime includes Apache Spark 3.5.x, which defines the SparkListenerApplicationEnd constructor as:

    public SparkListenerApplicationEnd(long time)

    This constructor takes a single long parameter.

  • Conflicting Spark Library in Databricks: The error arises due to a conflicting library: ----ws_3_5--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar. This library includes a different version of the SparkListenerApplicationEnd class, which defines the constructor as:

    public SparkListenerApplicationEnd(long time, scala.Option<Object> exitCode)

    This constructor signature matches the Spark 4.0.0-preview2 version.

  • Impact: At runtime, the JVM attempts to use the single-parameter constructor (<init>(J)V) but fails because the conflicting library expects the two-parameter version. This mismatch leads to the NoSuchMethodError.
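To confirm which variant of the class a cluster has actually loaded, you can list its declared constructors via reflection. A minimal sketch (the helper is plain JVM reflection with no Spark dependency; the commented call is what you would run in a notebook on the cluster):

```scala
// List the declared constructor signatures of a class by name.
// Plain JVM reflection; no Spark dependency in the helper itself.
def constructorSignatures(className: String): Seq[String] =
  Class.forName(className).getDeclaredConstructors.toSeq.map(_.toString)

// On a Databricks cluster, inspect the listener class:
// constructorSignatures("org.apache.spark.scheduler.SparkListenerApplicationEnd")
//   .foreach(println)
// A single signature ending in "(long)" means the Spark 3.5 variant is loaded;
// one containing "(long,scala.Option)" indicates the 4.0-preview variant.
```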

Thank you in advance for your support!

6 REPLIES

Walter_C
Databricks Employee

The error you are encountering occurs because the runtime includes Apache Spark 3.5.x, which defines the constructor as public SparkListenerApplicationEnd(long time), while a conflicting library (----ws_3_5--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar) expects a different version of the constructor: public SparkListenerApplicationEnd(long time, scala.Option&lt;Object&gt; exitCode).

 

This issue occurs because the JVM attempts to use the single-parameter constructor but fails due to the conflicting library expecting the two-parameter version, leading to the NoSuchMethodError.

To resolve this issue, you can try the following steps:

  1. Identify and Remove Conflicting Libraries: Check your dependencies and remove or update the conflicting library that includes the SparkListenerApplicationEnd class with the two-parameter constructor. Ensure that all libraries are compatible with Apache Spark 3.5.x.

  2. Update Dependencies: Ensure that all your project dependencies are updated to versions compatible with Databricks runtime 15.5 LTS and Apache Spark 3.5.x.
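For step 1, a quick way to find which jar actually supplied the class at runtime (a sketch using standard JVM reflection; run it on the cluster that shows the error):

```scala
// Report which jar (if any) a class was loaded from, to spot shadowed classes.
// Returns None for bootstrap classes (e.g. java.lang.*), which have no code source.
def jarOf(className: String): Option[String] =
  Option(Class.forName(className).getProtectionDomain.getCodeSource)
    .map(_.getLocation.toString)

// On the cluster, check the listener class:
// jarOf("org.apache.spark.scheduler.SparkListenerApplicationEnd").foreach(println)
// If the path points at the ----ws_3_5--core--core-hive-... jar rather than the
// Spark core jar you expected, you have found the shadowing library.
```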

sahil_s_jain
New Contributor II

The issue is that Databricks 15.4 LTS includes the ws_3_5--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar library, which is not compatible with Spark 3.5.x. Spark 3.5.x defines a single-argument SparkListenerApplicationEnd constructor.

 

Databricks 15.4 LTS includes Spark 3.5.x, so the ws_3_5--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar library should be compatible with Spark 3.5.x. But it is not.

sahil_s_jain
New Contributor II

I am trying to initialize the class org.apache.spark.scheduler.SparkListenerApplicationEnd with Databricks 15.4 LTS.

Spark 3.5.0 expects a single-argument constructor: org.apache.spark.scheduler.SparkListenerApplicationEnd(long time)

Whereas the class packaged in the Databricks jar "----ws_3_5--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar" expects a two-argument constructor:

org.apache.spark.scheduler.SparkListenerApplicationEnd(long time, scala.Option&lt;Object&gt; exitCode)

This two-argument constructor is in line with Spark 4.0.0-preview2 and is NOT in Spark 3.5.0.

This is causing a conflict; can you please check this version issue in the Databricks cluster binaries?
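Until the cluster binaries are fixed, one defensive workaround is to create the event reflectively, trying the Spark 3.5 single-argument constructor first and falling back to the two-argument 4.0-preview shape. A sketch, not an official fix; DemoEnd below is a hypothetical stand-in mirroring the two-argument constructor, used only so the example is self-contained:

```scala
// Instantiate an "application end"-style event against whichever constructor
// the loaded class actually exposes: (long) or (long, scala.Option).
def newEndEvent(cls: Class[_], time: Long): AnyRef =
  scala.util.Try(cls.getConstructor(classOf[Long])) match {
    case scala.util.Success(c) =>
      // Spark 3.5 shape: (long)
      c.newInstance(Long.box(time)).asInstanceOf[AnyRef]
    case scala.util.Failure(_) =>
      // 4.0-preview shape: (long, scala.Option<Object>); pass None for exitCode
      val c = cls.getConstructor(classOf[Long], classOf[Option[_]])
      c.newInstance(Long.box(time), None).asInstanceOf[AnyRef]
  }

// Hypothetical stand-in with the two-argument, 4.0-preview constructor shape:
case class DemoEnd(time: Long, exitCode: Option[Any])
```

On a real cluster you would pass classOf for the actual listener class instead of DemoEnd; the helper then works regardless of which jar wins on the classpath.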

DBonomo
Visitor

I can attest to this being the case as well. I ran into this issue trying to implement an updated form of the com.microsoft.sqlserver.jdbc.spark connector, and found that the implementation in DBR 15.4 LTS is actually mapped to master (the current Spark 4.0 working branch).


You can reference the 3.5 implementation here compared to the master branch version here.




sahil_s_jain
New Contributor II

@DBonomo , did you find any workaround for this?

DBonomo
Visitor

No, I am currently downgrading to an older DBR (13.3) and running these jobs specifically on that version. That brings its own suite of problems though.
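For anyone pinning only the affected jobs, the runtime can be set per-job in the Jobs API cluster spec rather than cluster-wide. A sketch; the node type and worker count are placeholders you would replace with your own:

```json
{
  "new_cluster": {
    "spark_version": "13.3.x-scala2.12",
    "node_type_id": "<your-node-type>",
    "num_workers": 2
  }
}
```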
