How to determine if I am using the same DBR minor version?

brickster_2018
Databricks Employee

DBR minor version details are not exposed. However, the documentation mentions that Databricks performs maintenance releases every two weeks. How can I determine whether I am using the same minor version?

1 ACCEPTED SOLUTION


brickster_2018
Databricks Employee

The code snippet below can help determine the DBR hash string for the DBR version. The DBR hash string is unique for each DBR minor version.

val scalaVersion = scala.util.Properties.versionString            // Scala version bundled with the runtime
val hadoopVersion = org.apache.hadoop.util.VersionInfo.getVersion // Hadoop version
val baseVersion = org.apache.spark.BuildInfo.sparkBranch          // Apache Spark branch the DBR is built from
val baseSha = org.apache.spark.BuildInfo.apacheBase               // commit SHA of the Apache Spark base
val dbrSha = org.apache.spark.BuildInfo.gitHash                   // DBR hash string (unique per minor version)
val universeSha = com.databricks.BuildInfo.gitHash                // commit SHA of the Databricks internal build
val universeTimestamp = com.databricks.BuildInfo.gitTimestamp     // timestamp of that build
val imageId = spark.conf.get("spark.databricks.clusterUsageTags.sparkVersion") // DBR image identifier
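
To compare two clusters, one option (a minimal sketch, assuming the vals above are in scope) is to fold the identifiers into a single fingerprint string and run the same cell on each cluster; identical output means the same DBR minor version.

val fingerprint = Seq(imageId, baseVersion, dbrSha, universeSha, universeTimestamp).mkString(" | ")
println(fingerprint) // compare this line across clusters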
 
 

