Databricks Asset Bundle - variables for job trigger
11-19-2024 04:31 PM - edited 11-19-2024 04:35 PM
I'm using the Databricks CLI to deploy an asset bundle for a job. I'm trying to set up the configuration so that the "dev" target has no trigger and the "prod" target does: the dev job is not scheduled to run, while the prod job is. Ideally the dev job wouldn't have a schedule at all, rather than having one that is simply paused.
Is this possible?
```yaml
variables:
  trigger:
    description: The trigger information
    type: complex
    default:
      pause_status: UNPAUSED
      periodic:
        interval: 1
        unit: DAYS

resources:
  jobs:
    test_20241118:
      name: ${bundle.target} - test_20241118
      trigger: ${var.trigger}
      tasks:
        - task_key: hello-task
          notebook_task:
            notebook_path: ${workspace.file_path}/hello_world/hello
            source: WORKSPACE

targets:
  dev:
    default: true
    workspace:
      host: https://my-dev-workspace.azuredatabricks.net
    variables:
      trigger: # I WANT THIS TO ACTUALLY NOT EVEN EXIST
        description: The trigger information
        type: complex
        default:
          pause_status: PAUSED
          periodic:
            interval: 1
            unit: DAYS
  prod:
    workspace:
      host: https://my-prod-workspace.azuredatabricks.net
```
(Note: only one target may have `default: true`, so it is set on dev only.)
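One way to get a dev job with no trigger at all (rather than a paused one) is to omit the trigger from the shared job definition and add it only as a target-level resource override under prod; my understanding is that target-level `resources` entries are merged into the top-level definition. A sketch of that approach, reusing the job and host names from the question above:

```yaml
resources:
  jobs:
    test_20241118:
      name: ${bundle.target} - test_20241118
      # no trigger here, so the dev deployment has none at all
      tasks:
        - task_key: hello-task
          notebook_task:
            notebook_path: ${workspace.file_path}/hello_world/hello
            source: WORKSPACE

targets:
  dev:
    default: true
    workspace:
      host: https://my-dev-workspace.azuredatabricks.net
  prod:
    workspace:
      host: https://my-prod-workspace.azuredatabricks.net
    resources:
      jobs:
        test_20241118:
          trigger:          # added only in the prod target
            pause_status: UNPAUSED
            periodic:
              interval: 1
              unit: DAYS
```

With this layout the `trigger` variable is no longer needed, since the trigger exists only where it is defined.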
11-21-2024 10:50 PM
In your dev target, you can set `mode: development`, which pauses all triggers and schedules on deployment.
```yaml
targets:
  dev:
    mode: development
```
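If you want the behavior to be explicit in both environments, bundles also support `mode: production` for the prod target. A sketch using the hosts from the question (the comments describe my understanding of each mode's effect):

```yaml
targets:
  dev:
    mode: development   # pauses all triggers and schedules on deploy
    default: true
    workspace:
      host: https://my-dev-workspace.azuredatabricks.net
  prod:
    mode: production    # deploys the job as defined, so the trigger stays active
    workspace:
      host: https://my-prod-workspace.azuredatabricks.net
```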
DABs also have a newer feature: you can use presets to handle per-target settings.
```yaml
targets:
  dev:
    presets:
      name_prefix: "testing_"       # prefix all resource names with testing_
      pipelines_development: true   # set development to true for pipelines
      trigger_pause_status: PAUSED  # set pause_status to PAUSED for all triggers and schedules
      jobs_max_concurrent_runs: 10  # set max_concurrent_runs to 10 for all jobs
      tags:
        department: finance
```
Ref: Databricks Asset Bundle deployment modes | Databricks on AWS

