04-05-2022 06:29 AM
Thank you very much for sharing!
dbutils.notebook.entry_point isn't even mentioned in the official docs:
https://docs.databricks.com/dev-tools/databricks-utils.html#notebook-utility-dbutilsnotebook
07-25-2023 08:18 AM
Is there also a way to achieve this when the notebook does not complete successfully, e.g. when a structured streaming query fails?
09-02-2020 07:07 PM
The Spark config value "spark.databricks.clusterUsageTags.clusterAllTags" contains JSON with jobId, runId, clusterId, and other tags.
The article below will help you.
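In case it helps, here is a minimal sketch of reading that config and parsing it in Python. The exact JSON shape (assumed here to be a list of {"key": ..., "value": ...} entries) and the tag names ("JobId", "RunName") are assumptions and can vary by runtime version, so print the raw string first on your cluster.

import json

# Read the cluster tags that Databricks sets on job clusters.
raw_tags = spark.conf.get("spark.databricks.clusterUsageTags.clusterAllTags")

# Assumed shape: a JSON list of {"key": ..., "value": ...} entries.
tags = {t["key"]: t["value"] for t in json.loads(raw_tags)}

job_id = tags.get("JobId")    # assumed tag name; inspect `tags` to confirm
run_name = tags.get("RunName")  # assumed tag name
print(job_id, run_name)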
08-10-2021 08:52 AM
I would like to add that this works too:
dbutils.notebook.entry_point.getDbutils().notebook().getContext().jobId().get()
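One caveat: in an interactive (non-job) run there is no job ID, so the .get() call on the underlying option fails. A small sketch, assuming you want the same call to degrade gracefully rather than raise:

def current_job_id():
    # Returns the job ID when running as a job, or None when run interactively
    # (or when this internal context call isn't available on the cluster).
    try:
        ctx = dbutils.notebook.entry_point.getDbutils().notebook().getContext()
        return ctx.jobId().get()
    except Exception:
        return None

print(current_job_id())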
04-11-2024 12:49 PM
I know this is an old thread, but here is what is working well for me in Python now, retrieving the run_id as well and building the full link to that job run:
job_id = dbutils.notebook.entry_point.getDbutils().notebook().getContext().jobId().get()
run_id = dbutils.notebook.entry_point.getDbutils().notebook().getContext().currentRunId().get().id()
workspace_url = spark.conf.get("spark.databricks.workspaceUrl")
full_run_url = f"https://{workspace_url}/jobs/{job_id}/runs/{run_id}"
07-21-2025 08:02 PM
11-05-2025 10:57 AM
Thanks for the response @Manoj5 - I had to use the "safeToJson()" option too, because all of the previous suggestions in this thread were failing for me with an error like "py4j.security.Py4JSecurityException: Method public java.lang.String com.databricks.backend.common.rpc.CommandContext.toJson() is not whitelisted on class class com.databricks.backend.common.rpc.CommandContext".
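For anyone else landing here, a minimal sketch of that safeToJson() route. The nesting and key names used below ("attributes", "jobId", "currentRunId") are assumptions and may differ by workspace, so print the raw JSON string first to see what your context actually contains.

import json

# safeToJson() returns the notebook context as a JSON string without tripping
# the py4j whitelisting error that plain toJson() hits.
ctx_json = dbutils.notebook.entry_point.getDbutils().notebook().getContext().safeToJson()

# Assumed structure: the useful fields live under an "attributes" object.
attrs = json.loads(ctx_json).get("attributes", {})

job_id = attrs.get("jobId")          # assumed key name
run_id = attrs.get("currentRunId")   # assumed key name
print(job_id, run_id)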
Also, as a more general comment, it's incredibly frustrating that none of this is documented, or if it is, it's organized so poorly that I cannot find it.