I'm running into an exception when trying to run a Java Spark JAR that uses the delta-spark library as a job on a Databricks Runtime 16.4 LTS cluster on Azure.
I've tried several versions of the delta-spark library, from 3.0.0 up to the latest 3.3.1, and always get the same error. I'm building against Scala 2.13, and every version runs fine locally on Spark 3.5.2.
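The jar itself doesn't do anything exotic; the failure happens as soon as the "delta" format is resolved. Here is a minimal sketch of the kind of code involved (the app name, table path, and session config below are placeholders rather than my actual job):

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SparkSession;

    public class DeltaJob {
        public static void main(String[] args) {
            // On Databricks the session comes from the cluster; locally the
            // Delta extensions are enabled via these configs.
            SparkSession spark = SparkSession.builder()
                    .appName("delta-job")
                    .config("spark.sql.extensions",
                            "io.delta.sql.DeltaSparkSessionExtension")
                    .config("spark.sql.catalog.spark_catalog",
                            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
                    .getOrCreate();

            // Placeholder path; the real job writes and reads Delta tables like this.
            String path = "/tmp/delta/events";

            // Writing with format("delta") is where the DataSource lookup
            // (and the exception below) is triggered.
            spark.range(0, 10).write().format("delta").mode("overwrite").save(path);

            Dataset<Row> df = spark.read().format("delta").load(path);
            df.show();
        }
    }

Locally this writes and reads the table without issue; on the cluster it never gets past resolving the Delta data source.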
I've searched the documentation and knowledge base and have been unable to find anything related to the exception.
Here is the exception:
org.apache.spark.SparkException: java.util.ServiceConfigurationError: org.apache.spark.sql.sources.DataSourceRegister: Provider org.apache.spark.sql.delta.sources.DeltaDataSource could not be instantiated
Caused by: org.apache.spark.SparkException: java.lang.NoSuchMethodError: 'void org.apache.spark.sql.sources.CreatableRelationProviderShim.$init$(org.apache.spark.sql.sources.CreatableRelationProviderShim)'