Is it possible to get the Job Run ID of a notebook run by dbutils.notebook.run?

hanspetter
New Contributor III

When running a notebook using dbutils.notebook.run from a master notebook, a URL to that running notebook is printed, e.g.:

Notebook job #223150

Notebook job #223151

Is there any way to capture that Job Run ID (#223150 or #223151)? We have 50 or so notebooks that run in parallel, and if one of them fails it would be nice to go straight to the failed run without clicking every URL to find the correct one.

Thanks 🙂

19 REPLIES

Thank you very much for sharing!

dbutils.notebook.entry_point isn't even mentioned in the official docs:

https://docs.databricks.com/dev-tools/databricks-utils.html#notebook-utility-dbutilsnotebook

415963
New Contributor II

Is there also a way to achieve this when the notebook does not complete successfully, e.g. when a structured streaming query fails?
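One approach (a sketch of my own, not something confirmed in this thread): catch the failure in the master notebook instead of letting it abort the fan-out, so you still know which child failed. The `runner` parameter is a hypothetical injection point; inside Databricks it defaults to `dbutils.notebook.run`.

```python
# Hedged sketch: wrap each child run so a failure is recorded rather than
# propagating. `runner` is injectable only so the logic can be exercised
# outside Databricks; on the platform you would omit it.

def run_child(path, timeout_seconds=0, arguments=None, runner=None):
    """Run one child notebook and report (path, ok, result_or_error)."""
    if runner is None:
        runner = dbutils.notebook.run  # only resolvable inside a Databricks notebook
    try:
        result = runner(path, timeout_seconds, arguments or {})
        return (path, True, result)
    except Exception as exc:  # a failed child raises in the master
        return (path, False, str(exc))
```

With 50 notebooks you could map `run_child` over the list (e.g. via `concurrent.futures.ThreadPoolExecutor`) and then inspect only the entries where `ok` is False.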

wahanand
New Contributor II

The Spark config value "spark.databricks.clusterUsageTags.clusterAllTags" holds a JSON document that contains jobId, runId, clusterId, etc.

The article below should help:

https://community.databricks.com/s/feed/0D53f00001HKHjxCAH
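To illustrate: clusterAllTags is a JSON array of key/value pairs (the shape here is assumed from the linked article, and the sample values are made up). Inside Databricks you would fetch the string with `spark.conf.get("spark.databricks.clusterUsageTags.clusterAllTags")`; a small helper can flatten it into a dict:

```python
import json

def tags_to_dict(all_tags_json):
    """Flatten the [{"key": ..., "value": ...}, ...] pair list into a plain dict."""
    return {t["key"]: t["value"] for t in json.loads(all_tags_json)}

# Illustrative payload only -- real tags come from spark.conf.get(...):
sample = '[{"key": "JobId", "value": "223150"}, {"key": "RunName", "value": "my_run"}]'
print(tags_to_dict(sample)["JobId"])  # → 223150
```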

DungTran
New Contributor II

I would like to add that this works too:

dbutils.notebook.entry_point.getDbutils().notebook().getContext().jobId().get()

Rodrigo_Mohr
New Contributor II

I know this is an old thread, but here is what works well for me in Python now for retrieving the run_id as well and building the full link to that job run:

ctx = dbutils.notebook.entry_point.getDbutils().notebook().getContext()
job_id = ctx.jobId().get()
run_id = ctx.currentRunId().get().id()
workspace_url = spark.conf.get("spark.databricks.workspaceUrl")
full_run_url = f"https://{workspace_url}/jobs/{job_id}/runs/{run_id}"
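The link-building step above can be pulled into a small pure function, which makes it testable outside Databricks (the `/jobs/{job_id}/runs/{run_id}` path is the one from the post; https is assumed for workspace URLs, and the hostname below is made up):

```python
# Hedged sketch: pure helper for the run-URL format used in the snippet above.

def job_run_url(workspace_url, job_id, run_id):
    """Build the web link to a specific job run in this workspace."""
    return f"https://{workspace_url}/jobs/{job_id}/runs/{run_id}"

print(job_run_url("adb-123.4.azuredatabricks.net", 223150, 1))
# → https://adb-123.4.azuredatabricks.net/jobs/223150/runs/1
```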