We are starting to build some of our data pipelines on Databricks.
For this we use Jobs, and the steps (tasks) in these Jobs are implemented as Python wheels.
We are able to retrieve the Job ID, Job Run ID, and Task Run ID in our Python wheels from the SparkContext using:
from pyspark.context import SparkContext

sc = SparkContext.getOrCreate()
job_id = sc.getLocalProperty("spark.databricks.job.id")               # Job ID
job_run_id = sc.getLocalProperty("spark.databricks.job.parentRunId")  # Job Run ID
task_run_id = sc.getLocalProperty("spark.databricks.job.runId")       # Task Run ID
Now we would also like to get the name of the Job that is currently running the wheel. Is there a local property containing that information? So far we have not managed to retrieve a list of all keys that are stored as local properties. Is there any documentation describing all available properties? We did not find anything out there...
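For context, one workaround we are considering is to look the name up through the Jobs API 2.1 (GET /api/2.1/jobs/get) using the Job ID from above. A minimal sketch, assuming we ourselves make the workspace URL and a personal access token available to the task as environment variables (the names DATABRICKS_HOST and DATABRICKS_TOKEN below are our own choice, they are not set automatically on the cluster):

import os
import requests
from pyspark.context import SparkContext

sc = SparkContext.getOrCreate()
job_id = sc.getLocalProperty("spark.databricks.job.id")

# Assumed to be configured by us on the cluster/task, not provided automatically:
host = os.environ["DATABRICKS_HOST"]    # e.g. https://<workspace-url>
token = os.environ["DATABRICKS_TOKEN"]  # personal access token

# Fetch the job definition and read its name from the settings.
response = requests.get(
    f"{host}/api/2.1/jobs/get",
    headers={"Authorization": f"Bearer {token}"},
    params={"job_id": job_id},
)
response.raise_for_status()
job_name = response.json()["settings"]["name"]

However, we would prefer to avoid the extra API call and token handling if a local property (or some other mechanism inside the job) already exposes the Job name.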
Thanks in advance for your advice!