Databricks packages an old version of the BigQuery connector jar in its runtime (repackaged as a fat jar), while our application needs the latest version. The latest jar reads the property scala.binary.version from its spark-bigquery-connector.properties file; that property does not exist in the old jar's copy of the file.
Because the Databricks runtime always loads its internally packaged jars first, its own copy of the properties file is the one that gets loaded.
When our code (inside the latest BigQuery jar) looks up scala.binary.version, the property is missing and the application fails to start. Any suggestions to overcome this?
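To illustrate the failure mode, here is a minimal sketch (class and method names are my own, not the connector's actual internals) of the kind of classpath lookup that breaks: the JVM classloader returns the first matching resource on the classpath, so the Databricks-bundled properties file shadows the one in our jar, and the key comes back null.

```java
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

// Hypothetical sketch of the connector's property lookup.
// getResourceAsStream returns the FIRST matching resource on the
// classpath, which on Databricks is the runtime's old copy of the file.
public class ConnectorPropsCheck {

    static String scalaBinaryVersion() throws IOException {
        Properties props = new Properties();
        try (InputStream in = ConnectorPropsCheck.class.getClassLoader()
                .getResourceAsStream("spark-bigquery-connector.properties")) {
            if (in != null) {
                props.load(in);
            }
        }
        // Returns null when the shadowing (old) file lacks the key.
        return props.getProperty("scala.binary.version");
    }

    public static void main(String[] args) throws IOException {
        String v = scalaBinaryVersion();
        System.out.println(v == null
                ? "scala.binary.version not found: startup would fail here"
                : "scala.binary.version = " + v);
    }
}
```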
As workarounds, we have tried:
1. Relocating (shading) all the BigQuery classes — but we cannot relocate the path of the properties file that the classes reference internally.
2. Removing the class that performs the property check from the BigQuery jar, adding our own implementation of it, and bundling everything back into the application fat jar.
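For context on workaround 1, a shade relocation looks roughly like the following (package and shaded prefix are illustrative). Relocation rewrites package and class references in the bytecode, but a root-level resource such as spark-bigquery-connector.properties has no package, so its lookup path is not rewritten — which is why relocation alone did not solve the problem.

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <configuration>
    <relocations>
      <relocation>
        <!-- Illustrative pattern; adjust to the actual connector packages -->
        <pattern>com.google.cloud.spark.bigquery</pattern>
        <shadedPattern>shaded.com.google.cloud.spark.bigquery</shadedPattern>
      </relocation>
    </relocations>
  </configuration>
</plugin>
```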
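Workaround 2 could be sketched like this (a hypothetical drop-in class, not the connector's real one): keep the normal properties-file lookup, but fall back to a value baked in at build time so startup cannot fail even when the old Databricks file is loaded first.

```java
import java.io.InputStream;
import java.util.Properties;

// Hypothetical replacement for the connector's version-check class.
// Reads the properties file as usual, but never fails: if the shadowing
// (old) file lacks scala.binary.version, a build-time constant is used.
public class ScalaVersionResolver {

    // Fallback chosen at build time to match the application's Scala version
    // (assumption: 2.12 here; set this to whatever your build targets).
    static final String FALLBACK_SCALA_BINARY_VERSION = "2.12";

    static String resolve() {
        Properties props = new Properties();
        try (InputStream in = ScalaVersionResolver.class.getClassLoader()
                .getResourceAsStream("spark-bigquery-connector.properties")) {
            if (in != null) {
                props.load(in);
            }
        } catch (Exception ignored) {
            // Ignore and use the fallback below.
        }
        return props.getProperty("scala.binary.version",
                FALLBACK_SCALA_BINARY_VERSION);
    }

    public static void main(String[] args) {
        System.out.println("scala.binary.version = " + resolve());
    }
}
```

The advantage of the fallback over outright deleting the check is that the replacement still honors a correct properties file when one is actually found first on the classpath.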