Generic pipeline with Databricks Workflows: multiple triggers on a single job

Soufiane_Darraz
New Contributor II

A big limitation of Databricks Workflows is that you can't have multiple triggers on a single job. If you have a generic pipeline using Databricks notebooks and need to trigger it at different times for different sources, there's no built-in way to handle this. In contrast, ADF allows you to easily set up multiple triggers on a single pipeline, making scheduling much more flexible. How do you handle this in Databricks Workflows?

Accepted Solution

ashraf1395
Honored Contributor

Hi there @Soufiane_Darraz, completely agreed on this point; it is frustrating that we cannot attach multiple triggers to a single workflow. Some approaches we use in our own Databricks work, or have seen used in the industry:

- Simple: use an external tool for orchestration. As you mentioned, ADF or Airflow can trigger the workflow for your different use cases or sources. You can parameterize the workflow with job parameters according to your use case (say, source or frequency) and pass those parameters to the workflow at trigger time (see the Airflow sketch after this list).

- If you don't want an additional orchestration tool, you can call the Databricks REST API from a serverless function (an AWS Lambda, for example) that listens for scheduled trigger events and starts the workflow with the matching parameters (see the Lambda sketch after this list).

- If you want to stay completely Databricks-native, you can set up multiple workflows in a parent/child arrangement: the parent workflow runs on a fixed schedule and triggers the child workflows depending on a metadata table or a simple If/else task (see the parent-notebook sketch after this list).
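To make the external-orchestrator option concrete, here is a minimal Airflow sketch using the Databricks provider. The job ID, connection name, cron schedules, and the `source` notebook parameter are all illustrative assumptions; the idea is that several small DAGs, each with its own schedule, trigger the same Databricks job with different parameters.

```python
# Minimal sketch -- assumed values: job_id 123, an Airflow connection named
# "databricks_default", and a notebook widget/parameter called "source".
# Several DAGs, one per schedule, all trigger the SAME Databricks job,
# effectively giving one job multiple time-based triggers.
from datetime import datetime

from airflow import DAG
from airflow.providers.databricks.operators.databricks import DatabricksRunNowOperator

SCHEDULES = {"sales": "0 6 * * *", "inventory": "0 */2 * * *"}  # assumed sources

for source, cron in SCHEDULES.items():
    with DAG(
        dag_id=f"trigger_generic_pipeline_{source}",
        start_date=datetime(2024, 1, 1),
        schedule=cron,   # Airflow 2.4+; use schedule_interval on older versions
        catchup=False,
    ) as dag:
        DatabricksRunNowOperator(
            task_id="run_generic_job",
            databricks_conn_id="databricks_default",  # workspace credentials
            job_id=123,                               # the single generic job
            notebook_params={"source": source},       # per-trigger parameter
        )
    globals()[dag.dag_id] = dag  # expose each generated DAG to the scheduler
```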
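For the REST API route, here is a minimal AWS Lambda sketch, assuming the workspace URL and a token are supplied as environment variables, the job ID is 123, and each EventBridge schedule rule passes a constant JSON payload carrying a source_name field. One Lambda plus several EventBridge rules again gives one job many triggers.

```python
# Minimal Lambda sketch -- assumptions: DATABRICKS_HOST / DATABRICKS_TOKEN env
# vars, job_id 123, and EventBridge rules configured with "Constant (JSON text)"
# input such as {"source_name": "sales"}, which replaces the default event.
import json
import os
import urllib.request


def lambda_handler(event, context):
    host = os.environ["DATABRICKS_HOST"]    # e.g. https://<workspace>.cloud.databricks.com
    token = os.environ["DATABRICKS_TOKEN"]  # PAT kept in an env var / secret

    body = json.dumps({
        "job_id": 123,                                        # assumed job ID
        "notebook_params": {"source": event["source_name"]},  # from the rule's payload
    }).encode("utf-8")

    req = urllib.request.Request(
        url=f"{host}/api/2.1/jobs/run-now",
        data=body,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        run = json.loads(resp.read())
    return {"run_id": run["run_id"]}  # run-now returns the new run's ID
```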
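And for the Databricks-native route, a sketch of the parent job's driver notebook, assuming a hypothetical metadata table control.pipeline_triggers (columns job_id, source, enabled) and the Python databricks-sdk picking up the notebook's ambient credentials. The parent runs on one fixed schedule and fans out run-now calls for whichever sources are currently enabled.

```python
# Parent-notebook sketch -- assumptions: a Delta metadata table named
# control.pipeline_triggers and databricks-sdk auth from the notebook context
# (the sdk is preinstalled on recent Databricks runtimes).
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # uses the notebook's ambient credentials

# Read the metadata table that says which sources to run this cycle.
rows = spark.sql("""
    SELECT job_id, source
    FROM control.pipeline_triggers
    WHERE enabled = true
""").collect()

for row in rows:
    # Kick off the shared, generic child job with per-source parameters.
    w.jobs.run_now(
        job_id=row.job_id,
        notebook_params={"source": row.source},
    )
    print(f"Triggered child job {row.job_id} for source {row.source}")
```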

All of these methods are useful; the right one depends on your use case. For runs at different times, I have personally used the Databricks API with serverless functions or EventBridge, and it works well. For triggering multiple sources, I have used a parent/child workflow with a metadata table, or If/else tasks driven by a metadata table.


Soufiane_Darraz
New Contributor II

Hi Ashraf, thanks for your reply. We chose option 1 because it's simple and integrates well with our current setup, but yeah, this definitely needs to be improved in Workflows. Native support for multiple triggers would really make a difference in flexibility and maintainability.
