A similar question has already been asked, but the reply there is very confusing to me.
Basically, for automated jobs, I want to log the following information from inside a Python notebook that runs in the job:
- What is the cluster configuration (most importantly, the machine type and the number of workers)?
- Which user triggered the run?
- And possibly some other run metadata.
I know that all of this information can be retrieved via the Clusters API (https://docs.databricks.com/dev-tools/api/latest/clusters.html#get) - however, I am not sure how to do this from the cluster itself. Is there an easier way, e.g. through dbutils? If not, how would authorization work, given that the job might be started by anyone on our instance? I would highly appreciate some sample code that always works when running on the cluster itself; my rough (untested) idea is sketched below. Thanks!
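For reference, this is roughly what I have in mind: read the cluster ID from the Spark conf, grab the workspace URL and an API token from the notebook context, and call the Clusters API with them. Note that the `spark.databricks.clusterUsageTags.*` keys and the `dbutils.notebook.entry_point` context are things I pieced together from other answers - they are undocumented internals, so treat everything here as an assumption rather than a supported interface:

```python
import json
import requests

# Cluster ID of the cluster this notebook is running on.
# (Assumption: this conf key is set by Databricks on job clusters.)
cluster_id = spark.conf.get("spark.databricks.clusterUsageTags.clusterId")

# Notebook context: an undocumented entry point that exposes the workspace
# URL, a short-lived API token, and tags such as the triggering user.
ctx = dbutils.notebook.entry_point.getDbutils().notebook().getContext()
api_url = ctx.apiUrl().get()
api_token = ctx.apiToken().get()
# Assumption: the "user" tag holds the user the job run executes as.
triggering_user = ctx.tags().apply("user")

# Call the Clusters API with the context token, so no per-user
# credentials need to be configured in the job itself.
resp = requests.get(
    f"{api_url}/api/2.0/clusters/get",
    headers={"Authorization": f"Bearer {api_token}"},
    params={"cluster_id": cluster_id},
)
resp.raise_for_status()
cluster_info = resp.json()

print("Triggered by:", triggering_user)
print("Driver node type:", cluster_info.get("driver_node_type_id"))
print("Worker node type:", cluster_info.get("node_type_id"))
print("Number of workers:", cluster_info.get("num_workers"))
print(json.dumps(cluster_info, indent=2))
```

(For autoscaling clusters, `num_workers` may be absent and an `autoscale` block returned instead, if I understand the API docs correctly.) Is this the right approach, or is there something cleaner?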