06-03-2025 07:48 AM
Hi there,
we're running Scala jobs on Databricks, and I was eager to finally upgrade to Scala 2.13. However, Databricks Connect 16.4.x doesn't handle Scala versioning, so all of its dependencies are tied to Scala 2.12. It's rather tedious to exclude all the 2.12 dependencies. I'm also encountering the issue that the project wants to use Scala 2.13.16 (the latest version), but DBR 16.4 runs Scala 2.13.10.
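To give an idea of what I mean, the exclusions end up looking roughly like this in sbt (the coordinates and versions here are illustrative, not the exact Databricks Connect artifact list):

```scala
// build.sbt sketch (coordinates illustrative): pull in the _2.12 Connect
// artifact but strip its transitive _2.12 Scala modules, then re-add the
// 2.13 builds by hand -- one exclusion per offending artifact
scalaVersion := "2.13.10" // match the runtime, not the latest 2.13.16

libraryDependencies += ("com.databricks" % "databricks-connect_2.12" % "16.4.0")
  .exclude("org.scala-lang.modules", "scala-collection-compat_2.12")
  .exclude("org.scala-lang.modules", "scala-parallel-collections_2.12")

// replacement compiled for 2.13
libraryDependencies +=
  "org.scala-lang.modules" %% "scala-collection-compat" % "2.11.0"
```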
Will there be a Scala 2.13-specific release of Databricks Connect? Or is there a different way of running my project?
06-03-2025 12:47 PM
Here is some information to consider:
06-03-2025 10:31 PM
Hi Lou,
thanks for your detailed answer!
So far I'm trying option 2, but exchanging so many dependencies seems like a strange thing to do and feels like a game of whack-a-mole as new build errors pop up. If the dependency situation gets resolved with DBR 17, it might be acceptable.
I guess I'll try option 3 as well and cross-build my project.
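As I understand it, cross-building would mean something like this in sbt (versions illustrative; 2.13.10 pinned to match DBR 16.4):

```scala
// build.sbt sketch: cross-build for 2.12 (what Databricks Connect ships)
// and 2.13 (what DBR 16.4 runs), pinning the exact runtime versions
crossScalaVersions := Seq("2.12.18", "2.13.10")
scalaVersion := crossScalaVersions.value.head

// `sbt +compile` / `sbt +package` then builds both variants
```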
What do you mean with option 1? Not using Databricks Connect at all? In that case, do I just use normal Spark (3.5.2) dependencies to run my job on the cluster? Do I use Spark Connect's SparkSession or plain old SparkContext?
06-04-2025 04:16 AM
Runtime 17.0 is out in beta right now and I expect it to GA in the near future. Keep an eye out for the runtime release notes. Once they are released you will be able to see what's present and hopefully (fingers crossed) your dependency issue will be resolved. https://docs.databricks.com/aws/en/release-notes/runtime
06-04-2025 05:34 AM
I'm not sure we can migrate to 17 before it reaches LTS status, and there's also Spark 4.0 to migrate to. So I'd like to just use Scala 2.13 on 16.4 for now. It seems that getting rid of Databricks Connect solves both my dependency issues and the issues with Databricks' overrides of the Spark and Delta APIs.
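Concretely, dropping Connect would leave me with roughly this build (versions illustrative; Spark 3.5.2 to match the runtime):

```scala
// build.sbt sketch: compile against stock Spark/Delta artifacts built for
// Scala 2.13, marked Provided so the cluster's runtime supplies them
scalaVersion := "2.13.10"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-sql"   % "3.5.2" % Provided,
  "io.delta"         %% "delta-spark" % "3.2.0" % Provided
)
```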