
How to determine if I am using the same DBR minor version?

brickster_2018
Esteemed Contributor

DBR minor version details are not exposed. However, the documentation mentions that Databricks performs maintenance releases every two weeks. How can I determine whether I am using the same minor version?

1 ACCEPTED SOLUTION


brickster_2018
Esteemed Contributor

The code snippet below can help determine the DBR hash string for the DBR version. The DBR hash string is unique to each DBR minor version.

// Each value identifies a component of the Databricks Runtime build.
val scalaVersion = scala.util.Properties.versionString
val hadoopVersion = org.apache.hadoop.util.VersionInfo.getVersion

// Apache Spark branch and base commit the runtime build is derived from
val baseVersion = org.apache.spark.BuildInfo.sparkBranch
val baseSha = org.apache.spark.BuildInfo.apacheBase

// Git hashes of the Databricks Spark build and the runtime (universe) build
val dbrSha = org.apache.spark.BuildInfo.gitHash
val universeSha = com.databricks.BuildInfo.gitHash
val universeTimestamp = com.databricks.BuildInfo.gitTimestamp

// Runtime image ID reported by the cluster's usage tags
val imageId = spark.conf.get("spark.databricks.clusterUsageTags.sparkVersion")

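As a follow-up, a minimal sketch (assuming the vals from the snippet above have been evaluated in the same Scala notebook): combine the identifiers into a single fingerprint string so the output from two clusters can be compared directly. Identical fingerprints indicate the same DBR maintenance (minor) release.

// Minimal sketch, assuming the vals from the snippet above are in scope.
val dbrFingerprint = Seq(
  s"imageId=$imageId",
  s"universeSha=$universeSha",
  s"universeTimestamp=$universeTimestamp",
  s"dbrSha=$dbrSha"
).mkString(", ")

// Print on each cluster and compare the resulting strings.
println(dbrFingerprint)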
