How to Exclude or Overwrite Specific JARs in Databricks Jars
01-07-2025 06:36 AM
Spark Version in Databricks 15.5 LTS: The runtime includes Apache Spark 3.5.x, which defines the SparkListenerApplicationEnd constructor as:
public SparkListenerApplicationEnd(long time)
This constructor takes a single long parameter.
Conflicting Spark Library in Databricks: The error arises due to a conflicting library: ----ws_3_5--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar. This library includes a different version of the SparkListenerApplicationEnd class, which defines the constructor as:
public SparkListenerApplicationEnd(long time, scala.Option<Object> exitCode)
This constructor is present in the Spark 4.0.0-preview2 release.
Impact: At runtime, the JVM attempts to use the single-parameter constructor (<init>(J)V) but fails because the conflicting library expects the two-parameter version. This mismatch leads to the NoSuchMethodError.
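One way to confirm which build of the class the JVM actually resolves is to list its declared constructors via reflection. The sketch below is generic: the class name is the one from this thread, and it should be run inside a Databricks job or spark-shell where Spark is on the classpath (on a plain JVM it simply reports that the class is absent).

```java
import java.lang.reflect.Constructor;

public class ListenerConstructorCheck {
    // Returns human-readable signatures of all declared constructors of
    // className, or an empty array if the class is not on the classpath.
    static String[] constructorSignatures(String className) {
        try {
            Constructor<?>[] ctors = Class.forName(className).getDeclaredConstructors();
            String[] sigs = new String[ctors.length];
            for (int i = 0; i < ctors.length; i++) {
                sigs[i] = ctors[i].toString();
            }
            return sigs;
        } catch (ClassNotFoundException e) {
            return new String[0];
        }
    }

    public static void main(String[] args) {
        String target = "org.apache.spark.scheduler.SparkListenerApplicationEnd";
        String[] sigs = constructorSignatures(target);
        if (sigs.length == 0) {
            System.out.println(target + " not found on this classpath");
        } else {
            // On a Spark 3.5.x classpath this prints the single-long
            // constructor; on the conflicting Databricks jar it would show
            // the two-argument variant instead.
            for (String s : sigs) System.out.println(s);
        }
    }
}
```

Running this on the cluster shows directly whether the one-argument or two-argument constructor is the one that wins class loading.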
How can I exclude or overwrite a specific JAR file, ws_3_5--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar, from the Databricks compute dependencies when launching a Databricks compute cluster?
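Databricks runtime JARs generally cannot be removed from the image itself, so one commonly suggested workaround is to ask Spark to prefer the classes bundled in your application JAR over the platform's. Spark exposes the (experimental) `userClassPathFirst` settings for this; they can be set in the cluster's Spark config, for example:

```properties
# Prefer classes from the user-supplied JAR over the runtime's copies.
# Note: these are standard Spark options, but they are experimental and
# may surface other conflicts on Databricks; test carefully.
spark.driver.userClassPathFirst true
spark.executor.userClassPathFirst true
```

Whether this is safe depends on how much of Spark your uber JAR bundles; flipping class-loading order can trade one conflict for another.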
01-07-2025 06:41 AM
Hi @sahil_s_jain,
How are you installing the jar file in the cluster?
01-07-2025 07:13 AM
I am creating an uber JAR of my application with the Spark 3.5.0 dependencies and submitting that JAR to the cluster for execution.
Because the Spark classes from the jar above, "----ws_3_5--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar", are the ones being loaded,
the constructor SparkListenerApplicationEnd(long time, scala.Option<Object> exitCode) is creating a conflict.
In my code I use the single-argument constructor, which is correct for Spark 3.5.0, but the class loaded from the Databricks jar expects a two-argument constructor, which does not match Spark 3.5.0.
Constructor in spark 3.5.0
public SparkListenerApplicationEnd(long time)
Constructor in "----ws_3_5--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar"
public SparkListenerApplicationEnd(long time, scala.Option<Object> exitCode)
Need your inputs on this conflict, as the class "SparkListenerApplicationEnd" packaged in "----ws_3_5--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar" is not the same as in Spark 3.5.0.
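One contributing factor may be bundling Spark itself inside the uber JAR. Since the Databricks runtime always supplies its own Spark classes, the usual convention is to mark the Spark artifacts as `provided` so they are compiled against but not packaged. A minimal Maven sketch, using the standard Spark 3.5.0 coordinates:

```xml
<!-- Sketch: keep Spark out of the uber JAR so the runtime's own build
     is the only copy on the classpath. Coordinates are the standard
     Spark 3.5.0 / Scala 2.12 ones. -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.12</artifactId>
  <version>3.5.0</version>
  <scope>provided</scope>
</dependency>
```

This avoids two copies of the Spark classes competing at load time, though it does not help if the runtime's own copy genuinely differs from open-source 3.5.x, as described above.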

