Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

java.lang.NoSuchMethodError: org.apache.spark.sql.catalyst.trees.Origin.<init>

ashutosh0710
New Contributor II

While trying to run:

spark.sql("CALL iceberg_catalog.system.expire_snapshots(table => 'iceberg_catalog.d11_stitch.rewards_third_party_impact_base_query',  older_than => TIMESTAMP '2024-03-06 00:00:00.000')")

 

I'm getting:

Py4JJavaError: An error occurred while calling o415.sql.
: java.lang.NoSuchMethodError: org.apache.spark.sql.catalyst.trees.Origin.<init>(Lscala/Option;Lscala/Option;Lscala/Option;Lscala/Option;Lscala/Option;Lscala/Option;Lscala/Option;)V
	at org.apache.spark.sql.catalyst.parser.extensions.IcebergParserUtils$.position(IcebergSqlExtensionsAstBuilder.scala:377)
	at org.apache.spark.sql.catalyst.parser.extensions.IcebergParserUtils$.withOrigin(IcebergSqlExtensionsAstBuilder.scala:367)
	at org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsAstBuilder.visitSingleStatement(IcebergSqlExtensionsAstBuilder.scala:334)
	at org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsAstBuilder.visitSingleStatement(IcebergSqlExtensionsAstBuilder.scala:66)
	at org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser$SingleStatementContext.accept(IcebergSqlExtensionsParser.java:153)
	at org.antlr.v4.runtime.tree.AbstractParseTreeVisitor.visit(AbstractParseTreeVisitor.java:18)
	at org.apache.spark.sql.catalyst.parser.extensions.IcebergSparkSqlExtensionsParser.$anonfun$parsePlan$1(IcebergSparkSqlExtensionsParser.scala:124)
	at org.apache.spark.sql.catalyst.parser.extensions.IcebergSparkSqlExtensionsParser.parse(IcebergSparkSqlExtensionsParser.scala:178)
	at org.apache.spark.sql.catalyst.parser.extensions.IcebergSparkSqlExtensionsParser.parsePlan(IcebergSparkSqlExtensionsParser.scala:124)
	at com.databricks.sql.parser.DatabricksSqlParser.$anonfun$parsePlan$1(DatabricksSqlParser.scala:80)
	at com.databricks.sql.parser.DatabricksSqlParser.parse(DatabricksSqlParser.scala:101)
	at com.databricks.sql.parser.DatabricksSqlParser.parsePlan(DatabricksSqlParser.scala:77)
	at org.apache.spark.sql.SparkSession.$anonfun$sql$2(SparkSession.scala:892)
	at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:94)
	at org.apache.spark.sql.catalyst.QueryPlanningTracker.measurePhase(QueryPlanningTracker.scala:452)
	at org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:891)
	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1180)
	at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:890)
	at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:924)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:397)
	at py4j.Gateway.invoke(Gateway.java:306)
	at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
	at py4j.commands.CallCommand.execute(CallCommand.java:79)
	at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:199)
	at py4j.ClientServerConnection.run(ClientServerConnection.java:119)
	at java.lang.Thread.run(Thread.java:750)
File <command-2355211326393001>, line 1
----> 1 spark.sql("CALL iceberg_catalog.system.expire_snapshots(table => 'iceberg_catalog.d11_stitch.rewards_third_party_impact_base_query',  older_than => TIMESTAMP '2024-03-06 00:00:00.000')")
File /databricks/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/protocol.py:326, in get_return_value(answer, gateway_client, target_id, name)
    324 value = OUTPUT_CONVERTER[type](answer[2:], gateway_client)
    325 if answer[1] == REFERENCE_TYPE:
--> 326     raise Py4JJavaError(
    327         "An error occurred while calling {0}{1}{2}.\n".
    328         format(target_id, ".", name), value)
    329 else:
    330     raise Py4JError(
    331         "An error occurred while calling {0}{1}{2}. Trace:\n{3}\n".
    332         format(target_id, ".", name, value))

 

I tried using Databricks Runtime 14.3 and 15.3, but I'm getting the same issue on both runtimes.

I was able to run this successfully on DBR 14.1.

I'm adding the following JARs:

iceberg-spark-runtime-3.5_2.12-1.5.0.jar
iceberg-aws-bundle-1.5.0.jar
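
For context, the CALL iceberg_catalog.system.expire_snapshots(...) syntax only parses when the Iceberg SQL extensions and the iceberg_catalog catalog are registered in the cluster Spark conf (the stack trace shows IcebergSparkSqlExtensionsParser handling the statement). A quick sanity check from a notebook; the property names are the standard Iceberg ones, and the expected values in the comments are typical examples, not my exact setup:

# Sanity check: confirm the Iceberg SQL extensions and catalog registration
# that the CALL ... expire_snapshots syntax depends on.
print(spark.conf.get("spark.sql.extensions", "<not set>"))
print(spark.conf.get("spark.sql.catalog.iceberg_catalog", "<not set>"))
print(spark.conf.get("spark.sql.catalog.iceberg_catalog.type", "<not set>"))

# Typically expected (backend type varies by setup):
#   org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions
#   org.apache.iceberg.spark.SparkCatalog
#   hadoop / hive / glue
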
4 REPLIES

VZLA
Databricks Employee

The error is likely an incompatibility between the Iceberg JARs you are attaching and the Databricks Runtime version. The NoSuchMethodError means that the Origin constructor signature the Iceberg parser was compiled against no longer exists in the Catalyst classes shipped with the newer DBR versions. You'll need Iceberg builds that are compatible with the Spark/Catalyst version bundled in those runtimes.
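
As a quick way to confirm the mismatch, you can list the Origin constructor signatures your runtime actually ships and compare them with the seven scala.Option arguments shown in your stack trace. A rough sketch using py4j reflection (spark._jvm is an internal handle, so treat this purely as a diagnostic):

# Diagnostic sketch: print the constructors of Origin available in this runtime
# and compare with the 7-Option signature Iceberg 1.5.0 expects (see the stack trace).
origin_cls = spark._jvm.java.lang.Class.forName(
    "org.apache.spark.sql.catalyst.trees.Origin"
)
for ctor in origin_cls.getDeclaredConstructors():
    print(ctor)
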

ashutosh0710
New Contributor II

@VZLA I tried multiple combinations, but nothing worked. If possible, could you share which library versions are compatible with DBR 14.3 and above?

VZLA
Databricks Employee

@ashutosh0710 you can inspect the libraries bundled with the cluster under /databricks/jars to see which JARs and versions are present, or check the Classpath Entries list on the Spark UI's Environment tab.
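
For example, something like this in a notebook cell lists the relevant bundled JARs (a rough sketch; adjust the glob patterns to whatever you want to check):

# Sketch: list the Catalyst and Iceberg JARs visible on the driver
import glob
import os

patterns = ["/databricks/jars/*catalyst*", "/databricks/jars/*iceberg*"]
for pattern in patterns:
    for path in sorted(glob.glob(pattern)):
        print(os.path.basename(path))
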

ashutosh0710
New Contributor II

I checked the libraries in /databricks/jars; iceberg-spark-runtime-3.5_2.12-1.5.0.jar and iceberg-aws-bundle-1.5.0.jar appear to be compatible with the Spark version that is there.
