Hi there @Soufiane_Darraz , completely agreed with this point. It becomes frustrating when we cannot use multiple triggers in our workflows. Some approaches we use in our Databricks work, or have seen used in the industry, are:
- Simple: use an external tool for orchestration. Like you mentioned, ADF/Airflow can trigger the workflow you want to run for multiple use cases or sources. You can parameterize the workflow with different job parameters according to your use case (say, source or frequency) and send those params to the workflow at trigger time (see the Airflow sketch after this list).
- If you don't want to use any additional orchestration tool, you can simply use the Databricks REST APIs from some serverless compute, for example Lambda functions that listen to the trigger events and launch the workflow with the right parameters (see the Lambda sketch below).
- If you want to be completely Databricks native, you can have multiple workflows / a parent-child workflow set up where the parent workflow runs on a fixed schedule and triggers the child workflows depending on a metadata table or a simple If/else task (see the SDK sketch below).
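To make the first option concrete, here is a minimal Airflow sketch (assuming the apache-airflow-providers-databricks package) that triggers one parameterized Databricks job per source. The job_id, connection id, source list and the source/frequency parameter names are placeholders for illustration, not anything from your actual setup:

```python
# Sketch: trigger the same parameterized Databricks job from Airflow, once per source.
from datetime import datetime

from airflow import DAG
from airflow.providers.databricks.operators.databricks import DatabricksRunNowOperator

with DAG(
    dag_id="trigger_databricks_per_source",
    start_date=datetime(2024, 1, 1),
    schedule="0 6 * * *",   # one DAG per frequency; create another DAG for a different cadence
    catchup=False,
) as dag:
    for source in ["salesforce", "sap", "clickstream"]:   # hypothetical source list
        DatabricksRunNowOperator(
            task_id=f"run_ingest_{source}",
            databricks_conn_id="databricks_default",       # Airflow connection to the workspace
            job_id=123,                                    # the single parameterized Databricks job
            notebook_params={"source": source, "frequency": "daily"},
        )
```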
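For the REST API route, a rough sketch of a Lambda handler that calls the Jobs 2.1 run-now endpoint. The event shape (job_id, job_parameters) and the environment variables are assumptions for illustration; the event could come from EventBridge, S3, SQS, etc.:

```python
# Sketch: Lambda handler that triggers a Databricks job via the Jobs 2.1 REST API.
import json
import os
import urllib.request

DATABRICKS_HOST = os.environ["DATABRICKS_HOST"]    # e.g. https://<workspace>.cloud.databricks.com
DATABRICKS_TOKEN = os.environ["DATABRICKS_TOKEN"]  # better kept in Secrets Manager

def lambda_handler(event, context):
    # The incoming event decides which job to run and with which parameters.
    payload = {
        "job_id": event["job_id"],
        "job_parameters": event.get("job_parameters", {}),  # e.g. {"source": "salesforce"}
    }
    req = urllib.request.Request(
        url=f"{DATABRICKS_HOST}/api/2.1/jobs/run-now",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {DATABRICKS_TOKEN}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        body = json.loads(resp.read())
    return {"statusCode": 200, "run_id": body["run_id"]}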
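And for the Databricks-native option, a sketch of what the parent job's notebook task could look like using the Databricks Python SDK. The metadata table name and its columns (child_job_id, source, enabled) are hypothetical; inside a notebook, spark is already available and the SDK picks up workspace auth automatically:

```python
# Sketch: parent notebook task that reads a metadata table and triggers child jobs.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

# The metadata table decides which children run on this parent's schedule.
rows = (
    spark.table("ops.workflow_metadata")        # hypothetical metadata table
    .filter("enabled = true")
    .select("child_job_id", "source")
    .collect()
)

for row in rows:
    # Fire off each enabled child job with its own job parameters.
    w.jobs.run_now(
        job_id=row.child_job_id,
        job_parameters={"source": row.source},  # the child job reads this as a job parameter
    )
    print(f"Triggered child job {row.child_job_id} for source {row.source}")
```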
All of these methods are useful, and the choice depends on your use case. For different time-based runs I have personally tried the Databricks API with serverless functions or EventBridge, and it works well.
For triggering from multiple sources I have used a parent-child workflow with a metadata table / If/else tasks backed by a metadata table.