Did you find any documentation for spark.conf.get properties? I am trying to get some metadata about the environment my notebook is running in (specifically the cluster's custom tags), but I cannot find any information beyond a couple of forum posts.
I ended up using the code below to get a list of the tags for the cluster my notebook is running on. I only found this property thanks to another forum post (Azure Spark Configuration (Environment) Documentation - Microsoft Q&A):

cluster_tags = spark.conf.get("spark.databricks.clusterUsageTags.clusterAllTags")
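In case it helps anyone, the value that property returns appears to be a JSON string of key/value pairs, so it can be parsed into a plain dict. A minimal sketch, assuming that format (the sample string below is hypothetical; it is not officially documented):

```python
import json

# In a Databricks notebook you would fetch the raw value with:
#   raw = spark.conf.get("spark.databricks.clusterUsageTags.clusterAllTags")
# Hypothetical sample of what it returns: a JSON array of
# {"key": ..., "value": ...} objects (format seen in forum posts).
raw = '[{"key": "Vendor", "value": "Databricks"}, {"key": "team", "value": "data-eng"}]'

# Flatten the array into a simple dict for easy lookups.
tags = {t["key"]: t["value"] for t in json.loads(raw)}

print(tags["team"])  # -> data-eng
```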
It would be nice if there were documentation for this kind of thing somewhere, because it seems not to exist, and this is the only way to get environment properties. I am also looking for a way to get tags from my Azure resource group at the cluster level through a notebook. Does anyone know if this is possible?