Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Capturing notebook return codes in Databricks jobs

labromb
Contributor

Hi,

I currently run a number of notebook jobs from Azure Data Factory. A new requirement has come up where I need to capture, in ADF, a return code generated by the notebook. I tried using

dbutils.notebook.exit(json.dumps({"return_value": "[some text or code]"}))

but this value is not visible in the output of the job. I can, however, see it if I trigger that notebook from a Notebook activity in ADF.

Am I missing something? Any assistance greatly appreciated.
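For reference, a minimal sketch of the exit pattern described in the question. The `"SUCCESS"` value is a hypothetical placeholder; the point is that the payload must be a valid JSON string so the caller (another notebook, or ADF's Notebook activity output) can parse it back:

```python
import json

# Build the payload the notebook hands back to its caller.
# "return_value" is the key used in the question; the content is arbitrary.
payload = json.dumps({"return_value": "SUCCESS"})

# Inside a Databricks notebook this would be the final call:
#   dbutils.notebook.exit(payload)
# dbutils only exists on a cluster, so it is commented out here.

# The caller receives the string and parses it back:
result = json.loads(payload)
print(result["return_value"])  # -> SUCCESS
```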

2 REPLIES

JaviRuiz
New Contributor II

I am looking for this too, as I want to run a job from an Azure DevOps pipeline authorized to access Databricks.

JaviRuiz
New Contributor II

I tried dbutils.notebook.exit(&lt;value&gt;) with the value set to 0 and 1. I also tried sys.exit(&lt;value&gt;).
None of these makes any difference in the output of the Databricks CLI command

databricks jobs run-now --json '{"job_id": <JOB_ID>, "job_parameters": {"will_exit": "yes"}}'

Furthermore, if I use sys.exit(&lt;value&gt;), the API doesn't return a JSON result but fails:

Error: failed to reach TERMINATED or SKIPPED, got INTERNAL_ERROR: Task IntegrationTest failed with message: Workload failed, see run output for details. This caused all downstream tasks to get skipped

It's a pity...
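One possible workaround, sketched here under the assumption that the Jobs API `runs/get-output` endpoint is reachable: the value passed to dbutils.notebook.exit() is not part of run-now's synchronous output, but it is exposed in the endpoint's response under notebook_output.result. The host, token, and run id below are placeholders; for multi-task jobs, the run id must be the task's run id, not the parent job run id:

```python
import json
import urllib.request


def extract_notebook_result(body: dict) -> str:
    # The dbutils.notebook.exit() value appears under notebook_output.result;
    # very large values are truncated (flagged by notebook_output.truncated).
    return body.get("notebook_output", {}).get("result", "")


def get_notebook_result(host: str, token: str, run_id: int) -> str:
    """Fetch the value a notebook passed to dbutils.notebook.exit().

    Calls GET /api/2.1/jobs/runs/get-output with a bearer token.
    """
    req = urllib.request.Request(
        f"{host}/api/2.1/jobs/runs/get-output?run_id={run_id}",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return extract_notebook_result(json.load(resp))


# Offline demonstration with a response shaped like the API returns:
sample = {
    "notebook_output": {
        "result": '{"return_value": "SUCCESS"}',
        "truncated": False,
    }
}
print(extract_notebook_result(sample))  # -> {"return_value": "SUCCESS"}
```

An Azure DevOps pipeline could call get_notebook_result() after `databricks jobs run-now` completes, then branch on the parsed value instead of relying on the process exit code.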



