I would like to know how to schedule a DLT pipeline using Databricks Asset Bundles (DABs).
I'm trying to trigger a Delta Live Tables pipeline on a schedule using Databricks Asset Bundles. Below is my YAML configuration:
resources:
  pipelines:
    data_quality_pipelines:
      name: data_quality_pipelines
      trigger:
        cron:
          quartz_cron_schedule: "0 0 10 ? * Mon-Fri"
          timezone_id: "America/Sao_Paulo"
      continuous: false
      catalog: ${bundle.target}
      target: data_quality
      serverless: true
      libraries:
        - notebook:
            path: ../src/customfield_pipeline.ipynb
        - notebook:
            path: ../src/customfieldvalue_pipeline.ipynb
        - notebook:
            path: ../src/customer_pipeline.ipynb
        - notebook:
            path: ../src/team_pipeline.ipynb
        - notebook:
            path: ../src/user_pipeline.ipynb
      configuration:
        env_conf_file: ${var.env_conf_file}
        rules_conf_file: ${var.rules_conf_file}
After I deploy the bundle, the following error appears:
Uploading bundle files to /Workspace/Shared/deploy/.bundle/data_quality_pipelines/prod/files...
Deploying resources...
Updating deployment state...
Deployment complete!
Error: terraform apply: exit status 1
Error: cannot update pipeline: 'trigger' property is not supported yet.

  with databricks_pipeline.data_quality_pipelines,
  on bundle.tf.json line 61, in resource.databricks_pipeline.data_quality_pipelines:
  61: }
I saw in the official documentation (Databricks API: Create Pipeline) that the trigger argument is deprecated and that the continuous argument should be used instead. But continuous is just a boolean flag, so it gives me no way to set a cron schedule.
Does anyone know how to schedule a DLT pipeline using Databricks Asset Bundles? Should I use Databricks Workflows to orchestrate that?
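For example, I assume something along these lines would work: keep the pipeline definition without the trigger block, and add a job resource that carries the cron schedule and runs the pipeline through a pipeline_task. The job name and task_key below are placeholders of mine; the pipeline reference uses the bundle's ${resources.pipelines.<name>.id} interpolation:

resources:
  jobs:
    data_quality_schedule:
      name: data_quality_schedule
      schedule:
        # jobs use quartz_cron_expression, not the pipeline's quartz_cron_schedule
        quartz_cron_expression: "0 0 10 ? * Mon-Fri"
        timezone_id: "America/Sao_Paulo"
        pause_status: UNPAUSED
      tasks:
        - task_key: refresh_data_quality
          pipeline_task:
            # points at the pipeline defined elsewhere in this bundle
            pipeline_id: ${resources.pipelines.data_quality_pipelines.id}

Is this the recommended approach, or is there a way to keep the schedule on the pipeline resource itself?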