
Workflows: Running a dependent task despite an earlier task failure

sharpbetty
New Contributor II

I have a scheduled task running in a workflow.

Task 1 computes some parameters, which are then picked up by a dependent reporting task: Task 2.

I want Task 2 to report "Failure" if Task 1 fails. Yet creating a dependency in workflows means that Task 2 will not run if Task 1 fails.

Any suggestions on how I can keep the parameter sharing and dependency from Task 1 to Task 2, yet also allow Task 2 to fire even on failure of Task 1?

Edit: Screenshot now attached, showing Task 2 skipped after a failure in Task 1.

3 REPLIES

Kaniz_Fatma
Community Manager

Hi @sharpbetty , 

 

One possible solution is to use task values to pass the parameters from Task 1 to Task 2 and then use a try-except block in Task 2 to handle the case where Task 1 fails. Here are the steps to implement this solution:
 
1. In Task 1, use the dbutils.jobs.taskValues.set() command to set the parameters as task values. For example:

```python
# Publish the computed parameters so downstream tasks can read them
dbutils.jobs.taskValues.set(key="param1", value=42)
dbutils.jobs.taskValues.set(key="param2", value="foo")
```
2. In Task 2, use the dbutils.jobs.taskValues.get() command to retrieve the parameters as task values. For example:

```python
# Read the values published by the upstream task. The taskKey must match
# the upstream task's key exactly; task keys cannot contain spaces, so a
# task shown as "Task 1" would have a key like "Task_1".
try:
    param1 = dbutils.jobs.taskValues.get(taskKey="Task_1", key="param1")
    param2 = dbutils.jobs.taskValues.get(taskKey="Task_1", key="param2")
except ValueError:
    # Raised when the value was never set, e.g. because Task 1 failed
    param1 = None
    param2 = None
```
3. Use the param1 and param2 variables in Task 2 to generate the report.

4. If you want Task 2 to report "Failure" when Task 1 fails, add a check at the end of Task 2 to see whether param1 and param2 are None. If they are, Task 1 failed, and Task 2 should report "Failure".
 
For example:

```python
if param1 is None or param2 is None:
    # Task 1 failed, report "Failure"
    dbutils.notebook.exit("Failure")
else:
    # Task 1 succeeded, generate the report
    ...
```
Task 2 can still run even if Task 1 fails by using task values to pass the parameters between tasks. The try-except block in Task 2 lets you handle the case where Task 1 fails and still generate the report when Task 1 succeeds. The check at the end of Task 2 lets you report "Failure" if Task 1 fails.
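As a variant, here is a minimal sketch assuming the optional default argument that dbutils.jobs.taskValues.get() accepts; it avoids the try-except by falling back to None when the upstream value was never set ("Task_1" is an assumed task key):

```python
# Variant sketch: rely on the default fallback instead of catching ValueError.
# "Task_1" is an assumed task key; substitute your actual upstream task key.
param1 = dbutils.jobs.taskValues.get(taskKey="Task_1", key="param1", default=None)
param2 = dbutils.jobs.taskValues.get(taskKey="Task_1", key="param2", default=None)
```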

sharpbetty
New Contributor II

Thanks @Kaniz_Fatma for your reply!

But I'm a little lost. I have found that if Task 1 fails, Task 2 is skipped due to the dependency (see screenshot). So I can see no way to execute Task 2 on a failure of Task 1.

I could disconnect the tasks (remove the dependency), but my understanding is that if both tasks exist independently in the workflow, they will fire in parallel on the schedule, meaning Task 2 won't wait for the parameters generated by Task 1.

I also thought of splitting the tasks into different workflows on a staggered schedule, but then I can't pass the parameters between them, as they are in different flows.

NerdSan
New Contributor II

Hi @sharpbetty , 

Any suggestions on how I can keep the parameter sharing and dependency from Task 1 to Task 2, yet also allow Task 2 to fire even on failure of Task 1?


Setup:

  • Task 2 is dependent on Task 1

Challenge:

  • Fire Task 2 even on Task 1 failure

Solution:

  • Do "not" fail Task 1: handle the exception and set the parameters inside Task 1 (as suggested by @Kaniz_Fatma); see the sketch after this list
  • In Task 2, use the Task 1 parameter values to decide the next steps
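A minimal sketch of that pattern, assuming two notebook tasks with the hypothetical task keys Task_1 and Task_2 (task keys cannot contain spaces) and a hypothetical compute_parameters() helper. Task 1 catches its own exception instead of failing, records a status task value, and Task 2 branches on it:

```python
# Task 1 notebook: never let the task itself fail; record the outcome instead.
status = "OK"
try:
    param1, param2 = compute_parameters()  # hypothetical helper doing the real work
except Exception:
    status = "FAILED"
    param1, param2 = None, None

dbutils.jobs.taskValues.set(key="status", value=status)
dbutils.jobs.taskValues.set(key="param1", value=param1)
dbutils.jobs.taskValues.set(key="param2", value=param2)
```

```python
# Task 2 notebook: runs because Task 1 always "succeeds"; branch on the status.
status = dbutils.jobs.taskValues.get(taskKey="Task_1", key="status", default="FAILED")

if status == "FAILED":
    dbutils.notebook.exit("Failure")  # report the upstream failure

param1 = dbutils.jobs.taskValues.get(taskKey="Task_1", key="param1")
param2 = dbutils.jobs.taskValues.get(taskKey="Task_1", key="param2")
# ... generate the report with param1 and param2 ...
```

Because Task 1 completes successfully in both cases, the dependency no longer skips Task 2, while the parameter sharing through task values is preserved.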

Hope this helps. 
