Databricks tasks are not skipped when running tasks with the Airflow DatabricksWorkflowTaskGroup
02-24-2025 06:09 AM
Currently we are facing a challenge with the following use case:
The Airflow DAG has 4 tasks (Task1, Task2, Task3 and Task4) with the dependency chain Task1 >> Task2 >> Task3 >> Task4 (all tasks are spark-jar task types); a minimal sketch of this layout is shown below.
In the Airflow DAG there is a dynamic skip condition on Task2: if the feature flag is not enabled, Task2 is skipped, and Task3 and Task4 are then also skipped because their upstream task (Task2) was skipped.
Issue: As stated above, Task2 is shown as skipped in the Airflow DAG UI, but in the Databricks workflow Task2 still continues executing, which creates unwanted data insertion in the system (the same applies to Task3 and Task4).
Question: We found that there is a conditional (if/else) task type, but with the latest versions (Airflow 2.10.4 and Databricks provider 7.0.0) there is no provision to create conditional tasks from an Airflow operator. Is there any way to skip tasks in the Databricks workflow based on the Airflow task status?
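For reference, here is a minimal sketch of the DAG layout described above, assuming the DatabricksWorkflowTaskGroup and DatabricksTaskOperator from apache-airflow-providers-databricks 7.0.0. The connection id, jar path, main class names and cluster spec are placeholders, not our actual configuration:

```python
from pendulum import datetime

from airflow import DAG
from airflow.providers.databricks.operators.databricks import DatabricksTaskOperator
from airflow.providers.databricks.operators.databricks_workflow import (
    DatabricksWorkflowTaskGroup,
)

DATABRICKS_CONN_ID = "databricks_default"  # placeholder connection id

with DAG(
    dag_id="four_task_spark_jar_workflow",
    start_date=datetime(2025, 1, 1),
    schedule=None,
) as dag:
    with DatabricksWorkflowTaskGroup(
        group_id="spark_jar_workflow",
        databricks_conn_id=DATABRICKS_CONN_ID,
        job_clusters=[
            {
                "job_cluster_key": "shared_cluster",
                "new_cluster": {
                    "spark_version": "15.4.x-scala2.12",
                    "node_type_id": "i3.xlarge",
                    "num_workers": 2,
                },
            }
        ],
    ) as workflow:
        # Four spark-jar tasks; main class names and jar path are placeholders.
        task1, task2, task3, task4 = (
            DatabricksTaskOperator(
                task_id=f"Task{i}",
                databricks_conn_id=DATABRICKS_CONN_ID,
                job_cluster_key="shared_cluster",
                task_config={
                    "spark_jar_task": {"main_class_name": f"com.example.Task{i}"},
                    "libraries": [{"jar": "dbfs:/FileStore/jars/app.jar"}],
                },
            )
            for i in range(1, 5)
        )

        # Linear dependency chain as described above.
        task1 >> task2 >> task3 >> task4

# The whole task group is submitted to Databricks as a single job, so skipping
# Task2 on the Airflow side only marks the Airflow task instance as skipped;
# the corresponding Databricks job task still runs.
```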
02-24-2025 06:48 AM
Hi @anil_reddaboina,
Databricks allows you to add control-flow logic to tasks based on the success, failure, or completion of their dependencies. This can be achieved using the "Run if" dependencies field: https://docs.databricks.com/aws/en/jobs/run-if
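As a rough illustration (this is only a sketch of a Jobs API 2.1 payload, not something the Airflow provider generates), a condition task plus "run_if" on the downstream tasks would look roughly like this; the feature_flag job parameter, task keys, class names and jar path are placeholders:

```python
# Sketch of a Jobs API 2.1 "tasks" list using a condition task and run_if.
# This payload could be sent to POST /api/2.1/jobs/create; it is illustrative
# only, and the feature_flag parameter, class names and jar path are placeholders.
tasks = [
    {
        # Evaluates a job parameter at run time; downstream tasks depend on
        # the "true" outcome of this condition task.
        "task_key": "check_feature_flag",
        "condition_task": {
            "op": "EQUAL_TO",
            "left": "{{job.parameters.feature_flag}}",
            "right": "true",
        },
    },
    {
        "task_key": "Task2",
        "depends_on": [{"task_key": "check_feature_flag", "outcome": "true"}],
        "run_if": "ALL_SUCCESS",  # only run when all dependencies succeeded
        "spark_jar_task": {"main_class_name": "com.example.Task2"},
        "libraries": [{"jar": "dbfs:/FileStore/jars/app.jar"}],
    },
    {
        "task_key": "Task3",
        "depends_on": [{"task_key": "Task2"}],
        "run_if": "ALL_SUCCESS",
        "spark_jar_task": {"main_class_name": "com.example.Task3"},
        "libraries": [{"jar": "dbfs:/FileStore/jars/app.jar"}],
    },
]
```

When the condition evaluates to false, Databricks excludes Task2 rather than running it, and with "ALL_SUCCESS" its downstream tasks are excluded in turn.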
03-02-2025 08:34 PM
Thanks for your message.
Creating a "Run if" or condition task type is currently not available through the Airflow Databricks provider.

