Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Databricks runtime and Java Runtime

Witold
Contributor III

The Databricks Runtime ships with two Java runtimes: JRE 8 and JRE 17. JRE 8 is used by default, but you can switch to JRE 17 by setting the environment variable JNAME, e.g. JNAME=zulu17-ca-amd64.
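For clusters created through the REST API, JNAME goes into spark_env_vars in the cluster spec. A minimal sketch, assuming a placeholder workspace URL, a token in $DATABRICKS_TOKEN, and an illustrative node type and DBR version:

```shell
# Create a cluster that runs on JRE 17 by setting JNAME.
# <workspace-url>, node_type_id, and spark_version are placeholders;
# adjust them to your workspace and cloud.
curl -X POST "https://<workspace-url>/api/2.0/clusters/create" \
  -H "Authorization: Bearer $DATABRICKS_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "cluster_name": "jre17-cluster",
    "spark_version": "14.3.x-scala2.12",
    "node_type_id": "i3.xlarge",
    "num_workers": 1,
    "spark_env_vars": { "JNAME": "zulu17-ca-amd64" }
  }'
```

The same key/value pair can also be entered in the UI under Advanced options > Spark > Environment variables.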

FWIW, AFAIK JNAME is available since DBR 10.
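To confirm which JRE a cluster actually picked up, you can run java -version from a notebook. Note that java -version prints to stderr, not stdout. The parsing helper below is a small illustrative sketch (not a Databricks API) that handles both the legacy "1.8" scheme and the modern "17"/"21" scheme:

```python
import re
import subprocess

def java_major_version(version_line: str) -> int:
    """Extract the major Java version from a line like
    'openjdk version "17.0.9"' or 'java version "1.8.0_392"'."""
    m = re.search(r'version "(\d+)(?:\.(\d+))?', version_line)
    if not m:
        raise ValueError(f"unrecognized version line: {version_line!r}")
    major = int(m.group(1))
    # Legacy scheme: "1.8.0_392" means Java 8.
    if major == 1 and m.group(2):
        return int(m.group(2))
    return major

# In a notebook cell (output goes to stderr):
# out = subprocess.run(["java", "-version"], capture_output=True, text=True)
# print(java_major_version(out.stderr.splitlines()[0]))
```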

We've seen use cases that benefit from switching the JRE, which makes a lot of sense given the major improvements added in recent JREs. Having this option in the DBR is great.

Spark itself supports these runtimes, and with Spark 4.0 we will have official support for Java 21, which is the current LTS release.

My questions would be:
* What is Databricks' strategy on Java runtimes?
* Why does Databricks still ship JRE 8 as the default, while competitors like MS Fabric have switched to a newer version by default?
* The new version of Spark (Spark 4.0) is just around the corner, and Databricks already supports some of its new features today (like PySpark DataSources or the Variant type), which is awesome! What about Java 21? When will we be able to use this version in DBR?

1 ACCEPTED SOLUTION


Kaniz_Fatma
Community Manager

Hi @Witold

  1. Databricks Strategy on Java Runtimes: Databricks provides flexibility by shipping both JRE 8 and JRE 17 with the Databricks runtime. While JRE 8 is the default, you can specify the other JRE using the environment variable JNAME.

  2. Reason for Shipping JRE 8 as Default: The decision to ship JRE 8 as the default likely considers factors such as backward compatibility, stability, and existing customer workloads. While competitors may choose newer versions by default, Databricks prioritizes stability and minimizing disruptions for existing users. However, the option to switch to JRE 17 demonstrates Databricks' commitment to staying current.

  3. Java 21 Support in DBR: Databricks already supports some Spark 4.0 features, like PySpark DataSources and the Variant type. Regarding Java 21, no specific dates have been announced.


