07-13-2023 11:26 PM
Hello,
I have a job with many tasks running on a schedule, and the first task checks a condition. Based on that condition, I would like to either continue the job as normal, or stop right away and not run any of the other tasks. Is there a way to do this without leaving the job in a failed state?
07-14-2023 05:51 AM
You can use dbutils.notebook.exit("returnValue"), as explained here: stop execution of a notebook gracefully.
if not condition:
    dbutils.notebook.exit("Aborting as condition not met. Further tasks will be skipped")
07-14-2023 06:14 AM
@erigaud Hi. The two approaches below could help.
1. dbutils.notebook.exit() --> This will stop the job. You can even pass a value in the parentheses to print, based on your requirement.
2. sys.exit(0) --> This comes with the sys module and can also be used to exit your job. Both will work. You can try them and let me know.
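For example, a minimal sketch of both calls (they are alternatives, not meant to be combined), assuming condition is a boolean computed earlier in the notebook and the messages are just placeholders:

import sys

# Option 1: end the notebook run gracefully, optionally returning a value to the job UI
if not condition:
    dbutils.notebook.exit("Condition not met, exiting notebook")

# Option 2: raise SystemExit via the standard-library sys module
if not condition:
    sys.exit(0)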
07-14-2023 06:33 AM
I would want the full job to stop (including subsequent tasks), but running dbutils.notebook.exit() will mark the current task as successful and move on to the next task, right? I would want the job to finish right away in a success state.
07-14-2023 07:35 AM
Yes, you're correct. The downstream tasks would still be triggered.
You could try cancelling the run using the Jobs API, but I don't think that is what you are looking for since you want the job run to end in a success state.
Maybe propagating the check using task values could be an option, or you could simply add the check to every task.
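A rough sketch of the task values option (the task name check_condition and key should_run are made up for illustration):

# In the first task's notebook: publish the result of the check
dbutils.jobs.taskValues.set(key="should_run", value=bool(condition))

# In each downstream task's notebook: read the flag and exit early when it is False
should_run = dbutils.jobs.taskValues.get(
    taskKey="check_condition",  # name of the first task in the job
    key="should_run",
    default=False,
)
if not should_run:
    dbutils.notebook.exit("Skipping: condition was not met in the check_condition task")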
07-14-2023 07:45 AM
I think the best way to accomplish this would be either to propagate the check, as mentioned by @menotron, or to have the initial task in another job and only trigger the second job if the condition is met. Obviously it depends on the use case. Thank you for taking the time to answer!
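For the two-job variant, I imagine the first job's task would trigger the second job via the Jobs API run-now endpoint, something like this sketch (the workspace URL, secret scope, and job ID below are placeholders, not values from this thread):

import requests

host = "https://<your-workspace>.cloud.databricks.com"  # placeholder workspace URL
token = dbutils.secrets.get(scope="my-scope", key="jobs-api-token")  # hypothetical secret scope/key
second_job_id = 123456789  # hypothetical ID of the downstream job

if condition:
    resp = requests.post(
        f"{host}/api/2.1/jobs/run-now",
        headers={"Authorization": f"Bearer {token}"},
        json={"job_id": second_job_id},
    )
    resp.raise_for_status()
# If the condition is not met, the first job simply finishes in a success state
# and the second job is never triggered.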