Data Engineering

Gracefully stop a job based on condition

erigaud
Honored Contributor

Hello, 

I have a job with many tasks running on a schedule, and the first task checks a condition. Based on that condition, I want to either continue the job as normal, or stop right away and not run any of the other tasks. Is there a way to do this without leaving the job in a failed state?

 

3 ACCEPTED SOLUTIONS

Accepted Solutions

erigaud
Honored Contributor

I would want the full job to stop (including subsequent tasks), but running dbutils.notebook.exit() will mark the current task as successful and move on to the next task, right? I would want the job to finish right away in a success state.


menotron
New Contributor III

Yes, you're correct. The downstream tasks would still be triggered.
You could try cancelling the run using the Jobs API, but I don't think that is what you are looking for, since you want the job run to end in a success state.

Propagating the check using task values could be an option, or you could simply add the check to every task.
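A minimal sketch of the task-values approach, assuming a hypothetical row-count condition and a check task named check_task (both are placeholders). dbutils is only available inside a Databricks job, so those calls are shown as comments; the condition itself is kept as a plain function:

```python
# Sketch: propagate a condition check to downstream tasks via Databricks
# task values. The "new rows arrived?" check and THRESHOLD are hypothetical.
THRESHOLD = 0  # hypothetical threshold for the check

def condition_met(new_row_count: int, threshold: int = THRESHOLD) -> bool:
    """Pure condition check, kept separate so it can run anywhere."""
    return new_row_count > threshold

# In the first ("check") task of the job:
# dbutils.jobs.taskValues.set(key="should_run", value=condition_met(row_count))

# At the top of every downstream task (taskKey must match the check task's name):
# should_run = dbutils.jobs.taskValues.get(
#     taskKey="check_task", key="should_run", default=False
# )
# if not should_run:
#     dbutils.notebook.exit("Skipping: upstream condition not met")
```

Each downstream task still starts, but exits immediately in a success state when the check fails, which keeps the whole job run green.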


erigaud
Honored Contributor

I think the best way to accomplish this would be to either propagate the check, as mentioned by @menotron, or have the initial task in another job and only run the second job if the condition is met. Obviously it depends on the use case. Thank you for taking the time to answer!
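A sketch of the two-job variant: the check job calls the Jobs API run-now endpoint (POST /api/2.1/jobs/run-now) to launch the second job only when the condition holds. The host, token, and job ID are placeholders, and only the request construction is shown; the actual send is commented out:

```python
# Sketch: conditionally trigger a second Databricks job via the Jobs API.
# HOST, TOKEN, and the job ID below are placeholders, not real values.
import json
import urllib.request

def build_run_now_request(host: str, token: str, job_id: int) -> urllib.request.Request:
    """Build (but do not send) a POST to the Jobs API run-now endpoint."""
    return urllib.request.Request(
        url=f"{host}/api/2.1/jobs/run-now",
        data=json.dumps({"job_id": job_id}).encode(),
        headers={
            "Authorization": f"Bearer {token}",  # personal access token (placeholder)
            "Content-Type": "application/json",
        },
        method="POST",
    )

# In the check job, after evaluating the condition:
# if condition_met:  # hypothetical check result
#     req = build_run_now_request("https://<workspace-host>", "<token>", 42)
#     with urllib.request.urlopen(req) as resp:
#         run_id = json.load(resp)["run_id"]
```

This keeps the check job small and always successful, while the heavy second job simply never starts on days the condition fails.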

 


5 REPLIES

menotron
New Contributor III

You can use dbutils.notebook.exit("returnValue"), as explained here, to stop execution of a notebook gracefully.

 

if not condition:
    dbutils.notebook.exit("Aborting as condition not met. Further tasks will be skipped.")

 

pvignesh92
Honored Contributor

@erigaud Hi. The two approaches below could help.

1. dbutils.notebook.exit() --> This will stop the job. You can even pass a value in the parentheses to print, based on your requirement.

2. sys.exit(0) --> This comes with the sys module, and you can use it as well to exit your job. Both will work. You can try them and let me know.
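For context on option 2: sys.exit(0) simply raises SystemExit with the given code, which is what ends execution of the script. A small plain-Python illustration (no Databricks dependency; the helper function is hypothetical, for demonstration only):

```python
# Illustration: sys.exit raises SystemExit, carrying the exit code.
import sys

def run_and_capture_exit(fn) -> int:
    """Call fn and return the SystemExit code it raises, or -1 if none."""
    try:
        fn()
    except SystemExit as exc:
        return exc.code
    return -1

# run_and_capture_exit(lambda: sys.exit(0)) captures exit code 0
```

Because SystemExit is an ordinary exception, surrounding code (or a notebook runner) can catch it, which is why its behavior inside a job task can differ from dbutils.notebook.exit().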


 
