08-02-2017 12:26 AM
When running a notebook with dbutils.notebook.run from a master notebook, a URL to that running notebook is printed, e.g.:
Notebook job #223150
Notebook job #223151
Is there any way to capture that Job Run ID (#223150 or #223151)? We have 50 or so notebooks that run in parallel, and if one of them fails it would be nice to jump straight to the actual run of that notebook without clicking every URL to find the correct one.
Thanks 🙂
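One possible way to do this (a minimal sketch, assuming each child notebook can be edited; the notebook path below is a placeholder): have every child read its own run ID from the notebook context and hand it back through dbutils.notebook.exit, since dbutils.notebook.run returns the child's exit value to the master notebook.
# In each child notebook: look up this run's ID and return it to the caller.
ctx = dbutils.notebook.entry_point.getDbutils().notebook().getContext()
dbutils.notebook.exit(str(ctx.currentRunId().get().id()))
# In the master notebook: the exit value (the run ID) comes back as a string.
child_run_id = dbutils.notebook.run("/path/to/child_notebook", 3600)  # placeholder path
print(f"Child notebook finished as run {child_run_id}")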
04-05-2022 06:29 AM
Thank you very much for sharing!
dbutils.notebook.entry_point isn't even mentioned in the official docs:
https://docs.databricks.com/dev-tools/databricks-utils.html#notebook-utility-dbutilsnotebook
07-25-2023 08:18 AM
Is there also a way to achieve this when the notebook does not complete successfully, e.g. when a structured streaming query fails?
09-02-2020 07:07 PM
The Spark config value for "spark.databricks.clusterUsageTags.clusterAllTags" holds JSON that contains the jobId, runId, clusterId, etc.
The article below will help you.
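As a rough sketch of that approach (assuming the config value is a JSON array of {"key": ..., "value": ...} tag objects; the exact tag names such as JobId and RunName may differ per workspace and runtime):
import json

# Read the cluster tags set by Databricks; on a job cluster these include job/run details.
all_tags = json.loads(spark.conf.get("spark.databricks.clusterUsageTags.clusterAllTags"))
tags = {t["key"]: t["value"] for t in all_tags}
# The tag names here are assumptions; print `tags` to see what your cluster actually exposes.
print(tags.get("JobId"), tags.get("RunName"), tags.get("ClusterId"))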
08-10-2021 08:52 AM
I would like to add that this works too:
dbutils.notebook.entry_point.getDbutils().notebook().getContext().jobId().get()
04-11-2024 12:49 PM
I know this is an old thread, but here is what is working well for me in Python now, for retrieving the run_id as well and building the full link to that job run:
job_id = dbutils.notebook.entry_point.getDbutils().notebook().getContext().jobId().get()
run_id = dbutils.notebook.entry_point.getDbutils().notebook().getContext().currentRunId().get().id()
workspace_url = spark.conf.get("spark.databricks.workspaceUrl")
full_run_url = f"https://{workspace_url}/jobs/{job_id}/runs/{run_id}"
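A small defensive variation on the snippet above (a sketch, not an official pattern): outside of a job run the context has no job/run ID and the .get() calls raise, so wrapping them keeps the same cell usable in interactive runs.
ctx = dbutils.notebook.entry_point.getDbutils().notebook().getContext()
try:
    job_id = ctx.jobId().get()
    run_id = ctx.currentRunId().get().id()
    workspace_url = spark.conf.get("spark.databricks.workspaceUrl")
    print(f"https://{workspace_url}/jobs/{job_id}/runs/{run_id}")
except Exception:
    # Interactive run: no job/run ID in the context, so there is no run URL to build.
    print("Not running as a job; no run URL to build.")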