Hello, thank you for your question!
Here’s a general approach to achieve this, but please let us know if our understanding of the requirement doesn’t align:
- Create a Parent Workflow with a Single Scheduled Trigger:
  - Schedule the workflow to run hourly, since that is the more frequent batch type.
  - Use a master task that queries the metadata table to determine which DLT pipelines should run in that execution.
- Use a Conditional Execution Mechanism:
  - Add a notebook task as the first step in the workflow that:
    - Reads the metadata table (which contains schedule information).
    - Determines whether the run is hourly or daily based on the current timestamp.
    - Sets workflow variables or dbutils.jobs.taskValues() for downstream task execution (see the sketch below).
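As a rough sketch, that master notebook task could look like the following. The table name `ops.pipeline_schedule` and its `pipeline_name`/`frequency` columns are assumptions here; adapt them to your metadata schema:

```python
# Minimal sketch of the master notebook task (runs first in the workflow).
# Assumes a metadata table ops.pipeline_schedule with columns
# pipeline_name and frequency ('hourly' or 'daily') -- adjust to your schema.
from datetime import datetime, timezone

# Treat the midnight (UTC) run as the daily run; every other run is hourly.
now = datetime.now(timezone.utc)
run_type = "daily" if now.hour == 0 else "hourly"

# Publish the decision so downstream tasks can condition on it.
dbutils.jobs.taskValues.set(key="run_type", value=run_type)

# Optionally publish the concrete list of pipelines due this cycle.
if run_type == "daily":
    predicate = "frequency IN ('hourly', 'daily')"
else:
    predicate = "frequency = 'hourly'"

rows = spark.sql(
    f"SELECT pipeline_name FROM ops.pipeline_schedule WHERE {predicate}"
).collect()
dbutils.jobs.taskValues.set(
    key="pipelines_to_run",
    value=[r.pipeline_name for r in rows],
)
```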
- Configure Dynamic Task Execution:
  - Define one task per DLT pipeline in the workflow.
  - Use conditional execution (Run if condition is met) to ensure that:
    - Hourly pipelines run on every execution.
    - Daily pipelines run only when the master task determines it's a daily run (see the example after this list).
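For illustration, a condition gate in the job definition might look like the fragments below, expressed as Python dicts in Jobs API 2.1 style. The task keys `master` and `is_daily_run`, the task name `daily_sales_pipeline`, and the pipeline ID are all placeholders:

```python
# Hypothetical Jobs API 2.1 payload fragments (as Python dicts) showing an
# If/else condition task that gates daily pipelines on the master task's value.
daily_gate = {
    "task_key": "is_daily_run",  # placeholder task key
    "depends_on": [{"task_key": "master"}],
    "condition_task": {
        "op": "EQUAL_TO",
        # Dynamic value reference to the task value published by the master task.
        "left": "{{tasks.master.values.run_type}}",
        "right": "daily",
    },
}

# A daily DLT pipeline task then depends on the gate's "true" outcome.
daily_pipeline = {
    "task_key": "daily_sales_pipeline",  # placeholder
    "depends_on": [{"task_key": "is_daily_run", "outcome": "true"}],
    "pipeline_task": {"pipeline_id": "<your-dlt-pipeline-id>"},
}
```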
- Use dbutils.jobs.taskValues() to Control Execution:
  - Configure each pipeline task with Depends on pointing at the master task, and set its execution condition based on the published value (a retrieval sketch follows).
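If a given step is a notebook task rather than a native DLT pipeline task, it can also read the published value directly. A minimal sketch, where the task key `master` is again a placeholder:

```python
# Read the value the master task published via task values.
# default/debugValue cover runs where the master task did not execute
# (e.g., interactive debugging of this notebook).
run_type = dbutils.jobs.taskValues.get(
    taskKey="master",
    key="run_type",
    default="hourly",
    debugValue="hourly",
)

if run_type != "daily":
    # Skip the rest of this notebook on hourly cycles.
    dbutils.notebook.exit("Skipped: not a daily run")
```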
Alternative Approach: Two Separate Workflows
If this doesn't give you flexible enough conditional execution for your needs, consider:
- A daily workflow (triggered once per day).
- An hourly workflow (triggered every hour).
- Both workflows query the metadata table and trigger only the relevant DLT pipelines.
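Each workflow's driver notebook could then look something like this sketch, assuming the databricks-sdk package is available and the metadata table also stores a `pipeline_id` column (both assumptions):

```python
# Sketch of a driver notebook shared by both workflows. Assumes the
# databricks-sdk package and a pipeline_id column in ops.pipeline_schedule.
from databricks.sdk import WorkspaceClient

FREQUENCY = "hourly"  # set to "daily" in the daily workflow (e.g., via a job parameter)

# In a Databricks job, WorkspaceClient() typically picks up credentials
# from the runtime context; configure it explicitly if yours does not.
w = WorkspaceClient()

rows = spark.sql(
    f"SELECT pipeline_id FROM ops.pipeline_schedule WHERE frequency = '{FREQUENCY}'"
).collect()

for row in rows:
    # Kick off an update for each pipeline matching this workflow's cadence.
    w.pipelines.start_update(pipeline_id=row.pipeline_id)
```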
Please let me know if your question was meant to be addressed more specifically, and/or if the above needs further clarification. In the meantime, hope it helps!