06-25-2021 10:45 AM
Does anybody know any in-notebook or JAR code to pull cluster tags from the runtime environment?
Something like...
dbutils.notebook.entry_point.getDbutils().notebook().getContext().tags().apply('user')
but for the cluster name?
Accepted Solutions
06-25-2021 10:46 AM
Turns out this is super easy! The cluster name and other custom tags are part of the Spark conf:
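The answer's code snippet didn't survive extraction; a minimal sketch, assuming the `spark.databricks.clusterUsageTags.clusterName` conf key (the property name the later replies in this thread use for tags, adapted for the cluster name):

```python
def get_cluster_name(conf):
    """Read the cluster name from a Spark-conf-like object.

    `conf` only needs a .get(key, default) method, so a plain dict works
    for testing; on Databricks you would pass `spark.conf` instead.
    """
    return conf.get("spark.databricks.clusterUsageTags.clusterName", None)

# Stand-in for spark.conf outside Databricks (value is illustrative):
fake_conf = {"spark.databricks.clusterUsageTags.clusterName": "my-cluster"}
print(get_cluster_name(fake_conf))  # -> my-cluster
```

On an actual cluster this collapses to a one-liner: `spark.conf.get("spark.databricks.clusterUsageTags.clusterName")`.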
10-11-2023 11:37 AM
Did you find any documentation for the spark.conf.get properties? I am trying to get some metadata about the environment my notebook is running in (specifically the cluster's custom tags), but I cannot find any information besides a couple of forum posts.
10-11-2023 11:41 AM
UPDATE
I ended up using the code below to get a list of the tags for the cluster my notebook is running on. I only found this property thanks to another forum post (Azure Spark Configuration (Environment) Documentation - Microsoft Q&A).
cluster_tags = spark.conf.get("spark.databricks.clusterUsageTags.clusterAllTags")
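For anyone who wants the tags as a dictionary rather than a raw string: a minimal sketch, assuming this property returns a JSON array of `{"key": ..., "value": ...}` objects (the sample value below is illustrative, not taken from a real cluster):

```python
import json

def parse_cluster_tags(all_tags_json):
    """Turn the clusterAllTags JSON string into a plain dict.

    Assumes the conf value is a JSON array of {"key": ..., "value": ...}
    objects, e.g. '[{"key": "Vendor", "value": "Databricks"}]'.
    """
    return {t["key"]: t["value"] for t in json.loads(all_tags_json)}

# Illustrative value; on a real cluster use:
# spark.conf.get("spark.databricks.clusterUsageTags.clusterAllTags")
sample = '[{"key": "Vendor", "value": "Databricks"}, {"key": "team", "value": "data-eng"}]'
print(parse_cluster_tags(sample))  # -> {'Vendor': 'Databricks', 'team': 'data-eng'}
```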
It would be nice if there were some documentation for this kind of thing somewhere, because it seems not to exist, and forum posts like this are the only way to discover these environment properties. I am also looking for a way to get the tags on my Azure resource group from the cluster level through a notebook. Does anyone know if this is possible?