2 weeks ago
A big limitation of Databricks Workflows is that you can't have multiple triggers on a single job. If you have a generic pipeline built on Databricks notebooks and need to trigger it at different times for different sources, there's no built-in way to handle this. In contrast, ADF lets you easily set up multiple triggers on a single pipeline, making scheduling much more flexible. How do you handle this in Databricks Workflows?
- Labels: Workflows
Accepted Solutions
2 weeks ago
Hi there @Soufiane_Darraz, completely agreed with this point. It is frustrating that we cannot attach multiple triggers to a single workflow. Some approaches we use in our Databricks work, or have seen used in the industry, are:
- Simple: use an external orchestration tool. As you mentioned, ADF or Airflow can trigger the workflow you want to run for multiple use cases or sources. Parameterize the workflow with job parameters that match your use case (say, source or frequency) and pass those parameters to the workflow at trigger time.
- If you don't want an additional orchestration tool, you can call the Databricks REST API from serverless functions (for example, AWS Lambda) that trigger the workflow with different parameters and respond to the scheduled trigger events.
- If you want to stay completely Databricks-native, you can set up multiple workflows or a parent/child workflow, where the parent workflow runs on a fixed schedule and triggers the child workflows based on a metadata table or a simple If/else task.
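The serverless option above can be sketched as a small handler that calls the Jobs `run-now` endpoint. This is a minimal sketch, not a production implementation: the `lambda_handler` name follows the AWS Lambda convention, and the environment-variable names, event fields, and job ID are assumptions for illustration.

```python
import json
import os
import urllib.request

def build_run_now_payload(job_id: int, params: dict) -> dict:
    """Body for POST /api/2.1/jobs/run-now with job-level parameters."""
    return {"job_id": job_id, "job_parameters": params}

def trigger_job(host: str, token: str, job_id: int, params: dict) -> dict:
    """Trigger a one-off run of an existing job with per-invocation parameters."""
    req = urllib.request.Request(
        f"{host}/api/2.1/jobs/run-now",
        data=json.dumps(build_run_now_payload(job_id, params)).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)  # response includes the run_id

def lambda_handler(event, context):
    """Hypothetical AWS Lambda entry point.

    Each EventBridge schedule carries its own parameters in the event,
    so one Databricks job effectively gets many independent triggers.
    """
    return trigger_job(
        host=os.environ["DATABRICKS_HOST"],    # e.g. your workspace URL (placeholder)
        token=os.environ["DATABRICKS_TOKEN"],  # read from a secret, never hardcoded
        job_id=int(event["job_id"]),
        params=event.get("job_parameters", {}),
    )
```

Each EventBridge rule then carries a different event payload (source, frequency, etc.), which maps cleanly onto one parameterized job.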
All of these methods are useful; which one fits depends on your use case. For different time-based runs, I have personally used the Databricks API with serverless functions and EventBridge, and it works well.
For multiple-source triggers, I have used a parent/child workflow driven by a metadata table, or If/else tasks with a metadata table.
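The metadata-driven parent/child idea can be sketched as follows: a parent job runs hourly and selects which child runs are due from a metadata table. Here a hardcoded list stands in for a real Delta table, and the column names (`source`, `run_hours`) are made up for illustration; the selected sources would then be triggered via Run Job tasks or the `run-now` API.

```python
from datetime import datetime

# Hypothetical metadata rows, one per source; in practice this would be
# read from a Delta table the parent job queries on each scheduled run.
METADATA = [
    {"source": "sales",   "run_hours": [2, 14]},
    {"source": "billing", "run_hours": [6]},
]

def sources_due(now: datetime, metadata=METADATA) -> list:
    """Return the sources whose schedule matches the current hour.

    The parent workflow runs hourly on a fixed schedule; only the
    sources returned here get their child workflow triggered.
    """
    return [row["source"] for row in metadata if now.hour in row["run_hours"]]
```

Adding a new schedule for a source then becomes a metadata update rather than a change to any workflow definition.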
2 weeks ago
Hi Ashraf, thanks for your reply. We chose option 1 because it's simple and integrates well with our current setup, but yes, this definitely needs to be improved in Workflows. Native support for multiple triggers would really make a difference in flexibility and maintainability.

