12-16-2021 02:11 AM
Hi there,
I am developing a cluster node initialization script (https://docs.gcp.databricks.com/clusters/init-scripts.html#environment-variables) in order to install some custom libraries.
According to the Databricks docs, init scripts can read some environment variables with data about the currently running cluster node.
But I need to figure out which Spark & Scala versions are currently deployed. Is this possible?
Thanks in advance
Regards
01-27-2022 06:14 AM
You should be able to just pick the version that matches your cluster's Spark and Scala versions from Maven.
Here is a simple way to get the cluster's Spark version:
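For example, in a Scala notebook attached to the cluster (a minimal sketch, assuming a notebook context rather than the init script itself):

// Run in a notebook cell on the target cluster
println(spark.version)                        // Spark version, e.g. 3.2.1
println(scala.util.Properties.versionString)  // Scala version of the driver JVM, e.g. "version 2.12.14"

The DBR release notes list the same Spark/Scala pairing, so either source can be used to pick the matching Maven artifact.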
01-27-2022 06:26 AM
The question is about an init script, though.
09-14-2022 09:02 AM
We can infer the cluster's DBR version from the environment variable $DATABRICKS_RUNTIME_VERSION. (For the exact Spark/Scala version mapping, refer to the release notes of that specific DBR version.)
Sample usage inside an init script:
#!/bin/bash
# DATABRICKS_RUNTIME_VERSION is set for init scripts and holds the DBR version, e.g. "10.4"
DBR_10_4_VERSION="10.4"

if [[ "$DATABRICKS_RUNTIME_VERSION" == "$DBR_10_4_VERSION"* ]]; then
  echo "Running 10.4 specific commands"
else
  echo "Skipping 10.4 specific commands"
fi

