I'm using the StarRocks Connector [2] to ingest data into StarRocks on Databricks Runtime (DBR) 13.1 (powered by Spark 3.4.0). The connector runs fine on community Spark 3.4 but fails on DBR. The cause is the following (full stack trace attached):
```
java.lang.IncompatibleClassChangeError: Conflicting default methods: org/apache/spark/sql/connector/write/BatchWrite.useCommitCoordinator org/apache/spark/sql/connector/write/streaming/StreamingWrite.useCommitCoordinator
	at com.starrocks.connector.spark.sql.write.StarRocksWrite.useCommitCoordinator(StarRocksWrite.java)
	at org.apache.spark.sql.execution.datasources.v2.V2TableWriteExec.writeWithV2(WriteToDataSourceV2Exec.scala:431)
	at org.apache.spark.sql.execution.datasources.v2.V2TableWriteExec.writeWithV2$(WriteToDataSourceV2Exec.scala:416)
```
`StarRocksWrite` implements both the `BatchWrite` and `StreamingWrite` interfaces, and it does not override the default method `useCommitCoordinator` because in Spark 3.4 only `BatchWrite` declares that method. From the DBR 13.1 release notes [1], I found that it includes an improvement, https://issues.apache.org/jira/browse/SPARK-42968, which adds a default `useCommitCoordinator` method to `StreamingWrite` as well, so there will be a conflicting-default-methods error if `StarRocksWrite` does not override the method. That improvement only lands in Spark 3.5, but DBR 13.1 (powered by Spark 3.4.0) pulls it in ahead of time, which is why the connector works well on community Spark but fails on DBR.
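For reference, here is a minimal sketch of what the fix on the connector side would look like: explicitly overriding `useCommitCoordinator` in the class, so the JVM never has to resolve between two equally specific inherited defaults. This is an assumption on my part, not the connector's actual code; the class and package names follow the stack trace above, all other methods are omitted (hence `abstract`), and I assume the connector wants to keep the Spark 3.4 `BatchWrite` default behavior of returning `true`.

```java
import org.apache.spark.sql.connector.write.BatchWrite;
import org.apache.spark.sql.connector.write.streaming.StreamingWrite;

// Sketch only: declared abstract so the other BatchWrite/StreamingWrite
// methods (createBatchWriterFactory, commit, abort, ...) can be omitted;
// the real connector class implements all of them.
public abstract class StarRocksWrite implements BatchWrite, StreamingWrite {

    // Once both BatchWrite and StreamingWrite declare a default
    // useCommitCoordinator(), a class implementing both must provide its
    // own override; otherwise the JVM throws IncompatibleClassChangeError
    // when the method is invoked at runtime.
    @Override
    public boolean useCommitCoordinator() {
        // Assumption: keep the Spark 3.4 BatchWrite default of using the
        // driver-side commit coordinator.
        return true;
    }
}
```

Because an explicit override in the class always takes precedence over inherited interface defaults, the same compiled class should then load and run on both community Spark 3.4 (where only `BatchWrite` has the default) and DBR 13.1 (where both interfaces have it).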
Is this a compatibility issue between community Spark and DBR? What is the right way to fix it?
[1] DBR 13.1 release notes
[2] StarRocks Connector