Resolved! Getting Spark & Scala version in Cluster node initialization script
Hi there, I am developing a cluster node initialization script (https://docs.gcp.databricks.com/clusters/init-scripts.html#environment-variables) in order to install some custom libraries. Reading the Databricks docs, we can get some environment var...
- 13722 Views
- 17 replies
- 3 kudos
Latest Reply
We can infer the cluster DBR version using the env var $DATABRICKS_RUNTIME_VERSION. (For the exact Spark/Scala version mapping, you can refer to the specific DBR release notes.) Sample usage inside an init script: DBR_10_4_VERSION="10.4" if [[ "$DATABRICKS_... (see the expanded sketch below)
- 3 kudos
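
Since the reply above is truncated, here is a minimal sketch of how that version check could look in a full init script. Only $DATABRICKS_RUNTIME_VERSION comes from the Databricks-provided environment variables; the library name and the pip invocation are placeholders assumed for illustration.

```bash
#!/bin/bash
# Sketch of a cluster init script that branches on the Databricks Runtime version.
# DATABRICKS_RUNTIME_VERSION is set by Databricks on each cluster node.

DBR_10_4_VERSION="10.4"

if [[ "$DATABRICKS_RUNTIME_VERSION" == "$DBR_10_4_VERSION"* ]]; then
  # DBR 10.4 LTS ships Spark 3.2 / Scala 2.12 per its release notes,
  # so install the build of the custom library that matches that stack.
  # "some-custom-library" is a placeholder package name.
  /databricks/python/bin/pip install some-custom-library==1.2.3
else
  echo "Unhandled DBR version: $DATABRICKS_RUNTIME_VERSION" >&2
fi
```

The prefix match (`== "$DBR_10_4_VERSION"*`) is used because the env var may carry a patch suffix; adjust the comparison if you need an exact match.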