Scheduling multiple jobs (workflows) in DABs
01-27-2025 03:14 AM - edited 01-27-2025 03:15 AM
Hello, I'm wondering how I can schedule multiple jobs (workflows).
I'd like to do something like this, but at the workflow level:
```yaml
tasks:
  - task_key: task_1
    sql_task:
      warehouse_id: ${var.warehouse_id}
      parameters:
        ...
      file:
        path: ../src/task_1.sql
  - task_key: task_2
    depends_on:
      - task_key: task_1
    sql_task:
      warehouse_id: ${var.warehouse_id}
      parameters:
        ...
      file:
        path: ../src/task_2.sql
```
and what I want is a master workflow like this:
```yaml
jobs/workflows:
  - job/workflow_key: workflow_1
    file:
      path: ../resources/workflow_1.yml
  - job/workflow_key: workflow_2
    depends_on:
      - workflow_key: workflow_1
    file:
      path: ../resources/workflow_2.yml
```
I'd really appreciate any resources/examples on this - thank you!
01-27-2025 04:38 AM
Hi @Greg_c,
You can try this structure:
- In the main databricks.yml:
```yaml
# databricks.yml
bundle:
  name: master-bundle

include:
  - resources/*.yml

# Other bundle configurations...
```
- In the resources directory, create a YAML file for each job:
```yaml
# resources/job1.yml
resources:
  jobs:
    job1:
      name: First Job
      tasks:
        - task_key: task1
          notebook_task:
            notebook_path: /path/to/notebook1
```
```yaml
# resources/job2.yml
resources:
  jobs:
    job2:
      name: Second Job
      tasks:
        - task_key: task2
          notebook_task:
            notebook_path: /path/to/notebook2
```
01-27-2025 06:33 AM
How exactly do I configure it here?
```yaml
# databricks.yml
bundle:
  name: master-bundle

include:
  - resources/*.yml
```
In what order will these jobs run? I want to be able to set the dependencies etc. myself. Please provide the whole code for these examples.
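For reference: `include` only merges resource definitions into the bundle; it does not impose any run order, and the deployed jobs are independent unless something orchestrates them. The closest native construct for cross-job dependencies is a separate "master" job whose tasks use `run_job_task` to trigger the other jobs, with ordering expressed through ordinary task-level `depends_on`. Below is a minimal sketch under that assumption; the file name `master_job.yml` and the job/task keys are illustrative:

```yaml
# resources/master_job.yml (illustrative file name)
resources:
  jobs:
    master_job:
      name: Master Orchestration Job
      tasks:
        # Trigger job1 first. The ${resources.jobs.job1.id} substitution
        # resolves to the deployed job's ID.
        - task_key: run_job1
          run_job_task:
            job_id: ${resources.jobs.job1.id}
        # job2 is triggered only after run_job1 completes successfully.
        - task_key: run_job2
          depends_on:
            - task_key: run_job1
          run_job_task:
            job_id: ${resources.jobs.job2.id}
```

After `databricks bundle deploy`, running `master_job` runs `job1` and then `job2` in order, while each job can still be run on its own.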

