We are currently facing a challenge with the following use case:
The Airflow DAG has four tasks (Task1, Task2, Task3, and Task4) with the following dependency chain:
Task1 >> Task2 >> Task3 >> Task4 (all tasks are of the spark-jar task type).
In the Airflow DAG, Task2 has a dynamic skip condition: if the feature flag is not enabled, Task2 is skipped, and Task3 and Task4 are then skipped as well because their upstream task (Task2) was skipped.
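To make the setup concrete, here is a stripped-down sketch of roughly how the DAG is laid out, not our exact code: it assumes the four spark-jar tasks sit inside a DatabricksWorkflowTaskGroup (so they are submitted to Databricks as a single job run), and the feature-flag skip is simplified to a pre_execute hook that raises AirflowSkipException. The connection id, cluster spec, JAR path, class names, and the Variable used for the flag are all placeholders.

```python
from datetime import datetime

from airflow import DAG
from airflow.exceptions import AirflowSkipException
from airflow.models import Variable
from airflow.providers.databricks.operators.databricks import DatabricksTaskOperator
from airflow.providers.databricks.operators.databricks_workflow import DatabricksWorkflowTaskGroup


def _skip_task2_if_flag_disabled(context):
    # Placeholder feature-flag lookup; raising AirflowSkipException marks the
    # Airflow task (and, via the default all_success trigger rule, its
    # downstream tasks) as skipped.
    if Variable.get("feature_flag_enabled", default_var="false").lower() != "true":
        raise AirflowSkipException("Feature flag disabled - skipping Task2")


with DAG(
    dag_id="databricks_workflow_skip_issue",
    start_date=datetime(2024, 1, 1),
    schedule=None,
    catchup=False,
) as dag:
    with DatabricksWorkflowTaskGroup(
        group_id="spark_jar_workflow",
        databricks_conn_id="databricks_default",  # placeholder connection id
        job_clusters=[
            {
                "job_cluster_key": "shared_cluster",
                "new_cluster": {
                    "spark_version": "15.4.x-scala2.12",
                    "node_type_id": "i3.xlarge",
                    "num_workers": 2,
                },
            }
        ],
    ) as workflow:

        def jar_task(task_id: str, main_class: str, **kwargs) -> DatabricksTaskOperator:
            # Each Airflow task maps to a spark_jar_task inside the same Databricks job run.
            return DatabricksTaskOperator(
                task_id=task_id,
                databricks_conn_id="databricks_default",
                job_cluster_key="shared_cluster",
                task_config={
                    "spark_jar_task": {"main_class_name": main_class},
                    "libraries": [{"jar": "dbfs:/jars/app.jar"}],  # placeholder JAR
                },
                **kwargs,
            )

        task1 = jar_task("Task1", "com.example.Task1")
        # The dynamic skip: when the flag is off, Task2 is skipped on the Airflow
        # side only; the Databricks job launched by the task group still runs it.
        task2 = jar_task("Task2", "com.example.Task2", pre_execute=_skip_task2_if_flag_disabled)
        task3 = jar_task("Task3", "com.example.Task3")
        task4 = jar_task("Task4", "com.example.Task4")

        task1 >> task2 >> task3 >> task4
```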
Issue: As stated above, Task2 is marked as skipped in the Airflow DAG run (UI), but in the Databricks Workflow Task2 still executes, which causes unwanted data to be inserted into the system (the same applies to Task3 and Task4).
Question: We found that Databricks Workflows support a conditional (if/else) task type, but there does not appear to be any provision to create such conditional tasks from the Airflow operators on the versions we use (Airflow 2.10.4 with Databricks provider 7.0.0). Is there any provision to skip tasks in the Databricks workflow based on the Airflow task status?
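For clarity, this is the Databricks task type we are referring to, sketched as a plain Jobs API 2.1 task definition (field names to the best of our understanding; the parameter name and main class are placeholders). What we cannot find is a way to generate such a condition task and its outcome-based dependency from the Airflow Databricks operators, let alone drive it from the Airflow task state:

```python
# Hypothetical sketch (not our current code): a Databricks "If/else condition"
# task as it would appear in a Jobs API 2.1 job spec.
condition_task_json = {
    "task_key": "feature_flag_check",
    "condition_task": {
        "op": "EQUAL_TO",
        "left": "{{job.parameters.feature_flag_enabled}}",  # placeholder job parameter
        "right": "true",
    },
}

# A downstream task would then run only when the condition evaluates to true.
task2_json = {
    "task_key": "Task2",
    "depends_on": [{"task_key": "feature_flag_check", "outcome": "true"}],
    "spark_jar_task": {"main_class_name": "com.example.Task2"},  # placeholder class
}
```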