Capturing notebook return codes in databricks jobs
10-28-2022 06:17 AM
Hi,
I am currently running a number of notebook jobs from Azure Data Factory (ADF). A new requirement has come up: I need to capture, in ADF, a return code generated by the notebook. I tried
dbutils.notebook.exit(json.dumps({"return_value": "[some text or code]"}))
but this value is not visible in the output of the job. I can, however, see the value if I trigger the notebook directly from a Notebook activity in ADF.
Am I missing something? Any assistance greatly appreciated.
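For reference, a minimal sketch of the exit call described above (the payload key "return_value" is just the example used here; dbutils is only defined inside a Databricks notebook, so the exit call itself is shown commented out):

```python
import json

# Build the payload to hand back to the caller; dbutils.notebook.exit
# takes a single string, so a dict is serialized with json.dumps first.
payload = json.dumps({"return_value": "some text or code"})

# Inside a Databricks notebook (dbutils only exists there):
# dbutils.notebook.exit(payload)

print(payload)
```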
- Labels:
- Azure Data Factory
- Databricks Jobs
- JOBS
a week ago
I am looking for this too, as I want to run a job from an Azure DevOps pipeline authorized to access Databricks.
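One way to surface a pass/fail signal in a DevOps pipeline is to poll the Jobs API (GET /api/2.1/jobs/runs/get) until the run terminates and map state.result_state to a process exit code. A minimal sketch of that mapping, run against a hard-coded sample response rather than a live workspace:

```python
import json

# Sample of the relevant part of a GET /api/2.1/jobs/runs/get response;
# a real pipeline would fetch this with an authenticated HTTP call.
sample_response = json.dumps({
    "run_id": 12345,
    "state": {
        "life_cycle_state": "TERMINATED",
        "result_state": "SUCCESS",
    },
})

def run_exit_code(response_text: str) -> int:
    """Map a terminal run state to a pipeline exit code (0 = success)."""
    state = json.loads(response_text)["state"]
    if state.get("life_cycle_state") != "TERMINATED":
        return 2  # run ended abnormally (INTERNAL_ERROR, SKIPPED, ...)
    return 0 if state.get("result_state") == "SUCCESS" else 1

print(run_exit_code(sample_response))  # 0 for this sample
```

The pipeline step can then call sys.exit() with this value so Azure DevOps marks the stage as failed when the run did not succeed.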
a week ago
I tried dbutils.notebook.exit(<value>) with values 0 and 1. I also tried sys.exit(<value>).
None of these produces any difference in the output of the Databricks CLI command
databricks jobs run-now --json '{"job_id": <JOB_ID>, "job_parameters": {"will_exit": "yes"}}'
Furthermore, if I use sys.exit(<value>), the call doesn't return a JSON result at all but fails with:
Error: failed to reach TERMINATED or SKIPPED, got INTERNAL_ERROR: Task IntegrationTest failed with message: Workload failed, see run output for details. This caused all downstream tasks to get skipped
It's a pity...
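For completeness: the exit value set by dbutils.notebook.exit is not part of the run-now response, but it is returned by GET /api/2.1/jobs/runs/get-output under notebook_output.result. A sketch of extracting it, run against a hard-coded sample response rather than a live workspace:

```python
import json

# Sample shaped like a GET /api/2.1/jobs/runs/get-output response;
# notebook_output.result holds whatever string dbutils.notebook.exit returned.
sample_output = json.dumps({
    "notebook_output": {
        "result": "{\"return_value\": \"some text or code\"}",
        "truncated": False,
    },
})

# The result is itself a JSON string here (the notebook used json.dumps),
# so it is decoded a second time to get at the individual fields.
result = json.loads(sample_output)["notebook_output"]["result"]
return_value = json.loads(result)["return_value"]
print(return_value)
```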

