Administration & Architecture

Databricks runtime and Java Runtime

Witold
Honored Contributor

The Databricks Runtime ships with two Java runtimes: JRE 8 and JRE 17. The first one is used by default, but you can set the environment variable JNAME to select the other one: JNAME: zulu17-ca-amd64.

FWIW, AFAIK JNAME has been available since DBR 10.
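For anyone looking for a concrete example: one place to set this is the cluster's Spark environment variables. Below is a minimal sketch using the Clusters REST API (/api/2.0/clusters/create); the workspace URL, token, node type, and DBR version are placeholders you'd replace with your own values.

import requests

# Placeholders - replace with your workspace URL and a personal access token
HOST = "https://<your-workspace>.cloud.databricks.com"
TOKEN = "<personal-access-token>"

cluster_spec = {
    "cluster_name": "jre17-test",
    "spark_version": "15.4.x-scala2.12",  # example DBR version, adjust as needed
    "node_type_id": "i3.xlarge",          # example node type, adjust as needed
    "num_workers": 1,
    # JNAME tells DBR which installed Java runtime to start the JVMs with
    "spark_env_vars": {"JNAME": "zulu17-ca-amd64"},
}

resp = requests.post(
    f"{HOST}/api/2.0/clusters/create",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=cluster_spec,
)
resp.raise_for_status()
print("Created cluster:", resp.json()["cluster_id"])

The same JNAME entry can also be set in the cluster UI under Advanced options > Spark > Environment variables.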

We've seen use cases that benefit from switching the JRE, which makes a lot of sense when you look at the major improvements added in the latest Java releases. Having this possibility in DBR is great.

Spark itself supports these runtimes, and with Spark 4.0 we will have official support for Java 21, which is the current LTS release.

My questions would be:
* What is Databricks' strategy on Java runtimes?
* Why does Databricks still ship JRE 8 as the default, while competitors like MS Fabric have decided to use a newer version by default?
* The new version of Spark (Spark 4.0) is just around the corner, and Databricks already supports some of its new features today (like PySpark DataSources or the Variant type), which is awesome! What about Java 21 - when will we be able to use this version in DBR?

4 REPLIES

AlexeyEgorov
New Contributor II

Thanks for your post. We learned this the hard way, as some of the software we use (Apache Sedona) produced strange errors that were solved by switching to JRE 17. I wonder what disadvantages in performance or security there might be in staying on JRE 8.

Witold
Honored Contributor

@AlexeyEgorov This post is a bit outdated: starting with Databricks Runtime 16, JDK 17 is the new default.
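If you want to double-check which runtime a given cluster actually ended up with, a quick sanity check from a notebook attached to it (plain Python, nothing DBR-specific):

import os
import subprocess

# `java -version` writes to stderr; this shows the JRE the driver node resolves by default
result = subprocess.run(["java", "-version"], capture_output=True, text=True)
print(result.stderr)
print("JAVA_HOME:", os.environ.get("JAVA_HOME"))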

AlexeyEgorov
New Contributor II

@Witold Thanks for the hint! However, since we haven't switched to 16 yet (it's still pretty new), we ran into those issues. I will let my team know.

catalyst
New Contributor II

@Witold Thanks for the original post. Any luck with JDK 21 on DBR 17?
I'm using some Java 17 features in my code alongside Spark 4.0.0, which I wanted to run on DBR 17. Sadly, the generic JNAME=zulu21-ca-amd64 did not work for me. I also tried other variations of valid builds, such as zulu21.34.19-ca-jdk21.0.3-linux_amd64.
However, none of these options (added in the Advanced options section) seemed to help.
Each time I ran ls on /usr/lib/jvm, I only found the default zulu17-ca-amd64.
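A slightly more direct check than listing /usr/lib/jvm is to ask the driver JVM itself which Java it is running on, via py4j from a notebook attached to the cluster (assuming the usual spark SparkSession provided by Databricks):

# Properties of the JVM that actually backs the Spark driver,
# which is more conclusive than listing installed JDK directories.
print(spark._jvm.java.lang.System.getProperty("java.version"))
print(spark._jvm.java.lang.System.getProperty("java.home"))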

I wish the JDK support in the Databricks Runtimes were in line with what the open-source Spark releases support (in this case, Java 21).
Are there any workarounds you could suggest here?