Did you find any documentation for spark.conf.get properties? I am trying to get some metadata about the environment my notebook is running in (specifically cluster custom tags), but I cannot find any information besides a couple of forum posts.
I ended up using the code below to get a list of the tags for the cluster my notebook is running on. I only found this property thanks to another forum post (Azure Spark Configuration (Environment) Documentation - Microsoft Q&A):

cluster_tags = spark.conf.get("spark.databricks.clusterUsageTags.clusterAllTags")
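According to the forum posts above, the raw value comes back as a single JSON string of key/value pairs rather than a ready-to-use dict. A small helper like the sketch below can parse it; this assumes the JSON shape is an array of {"key": ..., "value": ...} objects (the property is undocumented, so the format may differ across Databricks runtime versions).

```python
import json

def parse_cluster_tags(raw_tags: str) -> dict:
    """Parse the clusterAllTags JSON string into a {key: value} dict.

    Assumes the value is a JSON array of {"key": ..., "value": ...}
    objects, as reported in community forum posts (not officially
    documented).
    """
    return {tag["key"]: tag["value"] for tag in json.loads(raw_tags)}

# In a Databricks notebook you would first fetch the raw string:
# raw = spark.conf.get("spark.databricks.clusterUsageTags.clusterAllTags")
# tags = parse_cluster_tags(raw)
```

With this in place you can look up individual custom tags by name, e.g. tags.get("team"), instead of string-searching the raw value.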
It would be nice if there were documentation for these properties somewhere, because it does not seem to exist and this appears to be the only way to get environment properties. I am also looking for a way to read the tags on my Azure resource group from the cluster level through a notebook. Does anyone know if this is possible?