<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic [Databricks Asset Bundles] Triggering Delta Live Tables in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/databricks-asset-bundles-triggering-delta-live-tables/m-p/108795#M43143</link>
    <description>&lt;P&gt;I would like to know how to schedule a DLT pipeline using Databricks Asset Bundles (DABs).&lt;/P&gt;&lt;P&gt;I'm trying to trigger a Delta Live Tables pipeline using Databricks Asset Bundles. Below is my YAML code:&lt;/P&gt;&lt;PRE&gt;resources:
  pipelines:
    data_quality_pipelines:
      name: data_quality_pipelines
      trigger:
        cron:
          quartz_cron_schedule: "0 0 10 ? * Mon-Fri"
          timezone_id: "America/Sao_Paulo"
      continuous: false
      catalog: ${bundle.target}
      target: data_quality
      serverless: true

      libraries:
        - notebook:
            path: ../src/customfield_pipeline.ipynb
        - notebook:
            path: ../src/customfieldvalue_pipeline.ipynb
        - notebook:
            path: ../src/customer_pipeline.ipynb
        - notebook:
            path: ../src/team_pipeline.ipynb
        - notebook:
            path: ../src/user_pipeline.ipynb

      configuration:
        env_conf_file: ${var.env_conf_file}
        rules_conf_file: ${var.rules_conf_file}&lt;/PRE&gt;&lt;P&gt;After I deploy the bundle, the following error appears:&lt;/P&gt;&lt;PRE&gt;Uploading bundle files to /Workspace/Shared/deploy/.bundle/data_quality_pipelines/prod/files...
Deploying resources...
Updating deployment state...
Deployment complete!

Error: terraform apply: exit status 1

Error: cannot update pipeline: 'trigger' property is not supported yet.

with databricks_pipeline.data_quality_pipelines,
on bundle.tf.json line 61, in resource.databricks_pipeline.data_quality_pipelines:
61: }&lt;/PRE&gt;&lt;P&gt;I saw in the official documentation (&lt;A href="https://docs.databricks.com/api/workspace/pipelines/create" target="_blank" rel="noopener"&gt;Databricks API: Create Pipeline&lt;/A&gt;) that the trigger argument is deprecated. It recommends using the continuous argument instead, but that does not let me configure a schedule.&lt;/P&gt;&lt;P&gt;Does anyone know how to schedule a DLT pipeline using Databricks Asset Bundles? Should I use Databricks Workflows to orchestrate that?&lt;/P&gt;</description>
    <pubDate>Tue, 04 Feb 2025 14:08:15 GMT</pubDate>
    <dc:creator>jonhieb</dc:creator>
    <dc:date>2025-02-04T14:08:15Z</dc:date>
    <item>
      <title>[Databricks Asset Bundles] Triggering Delta Live Tables</title>
      <link>https://community.databricks.com/t5/data-engineering/databricks-asset-bundles-triggering-delta-live-tables/m-p/108795#M43143</link>
      <description>&lt;P&gt;I would like to know how to schedule a DLT pipeline using Databricks Asset Bundles (DABs).&lt;/P&gt;&lt;P&gt;I'm trying to trigger a Delta Live Tables pipeline using Databricks Asset Bundles. Below is my YAML code:&lt;/P&gt;&lt;PRE&gt;resources:
  pipelines:
    data_quality_pipelines:
      name: data_quality_pipelines
      trigger:
        cron:
          quartz_cron_schedule: "0 0 10 ? * Mon-Fri"
          timezone_id: "America/Sao_Paulo"
      continuous: false
      catalog: ${bundle.target}
      target: data_quality
      serverless: true

      libraries:
        - notebook:
            path: ../src/customfield_pipeline.ipynb
        - notebook:
            path: ../src/customfieldvalue_pipeline.ipynb
        - notebook:
            path: ../src/customer_pipeline.ipynb
        - notebook:
            path: ../src/team_pipeline.ipynb
        - notebook:
            path: ../src/user_pipeline.ipynb

      configuration:
        env_conf_file: ${var.env_conf_file}
        rules_conf_file: ${var.rules_conf_file}&lt;/PRE&gt;&lt;P&gt;After I deploy the bundle, the following error appears:&lt;/P&gt;&lt;PRE&gt;Uploading bundle files to /Workspace/Shared/deploy/.bundle/data_quality_pipelines/prod/files...
Deploying resources...
Updating deployment state...
Deployment complete!

Error: terraform apply: exit status 1

Error: cannot update pipeline: 'trigger' property is not supported yet.

with databricks_pipeline.data_quality_pipelines,
on bundle.tf.json line 61, in resource.databricks_pipeline.data_quality_pipelines:
61: }&lt;/PRE&gt;&lt;P&gt;I saw in the official documentation (&lt;A href="https://docs.databricks.com/api/workspace/pipelines/create" target="_blank" rel="noopener"&gt;Databricks API: Create Pipeline&lt;/A&gt;) that the trigger argument is deprecated. It recommends using the continuous argument instead, but that does not let me configure a schedule.&lt;/P&gt;&lt;P&gt;Does anyone know how to schedule a DLT pipeline using Databricks Asset Bundles? Should I use Databricks Workflows to orchestrate that?&lt;/P&gt;</description>
      <pubDate>Tue, 04 Feb 2025 14:08:15 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/databricks-asset-bundles-triggering-delta-live-tables/m-p/108795#M43143</guid>
      <dc:creator>jonhieb</dc:creator>
      <dc:date>2025-02-04T14:08:15Z</dc:date>
    </item>
    <item>
      <title>Re: [Databricks Asset Bundles] Triggering Delta Live Tables</title>
      <link>https://community.databricks.com/t5/data-engineering/databricks-asset-bundles-triggering-delta-live-tables/m-p/108796#M43144</link>
      <description>&lt;P&gt;As of now, Databricks Asset Bundles do not support direct scheduling of DLT pipelines using cron expressions within the bundle configuration. Instead, you can achieve scheduling by creating a Databricks job that triggers the DLT pipeline and then scheduling the job using the Databricks Jobs API or the Databricks UI.&lt;/P&gt;</description>
      <pubDate>Tue, 04 Feb 2025 14:08:30 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/databricks-asset-bundles-triggering-delta-live-tables/m-p/108796#M43144</guid>
      <dc:creator>Walter_C</dc:creator>
      <dc:date>2025-02-04T14:08:30Z</dc:date>
    </item>
    <item>
      <title>Re: [Databricks Asset Bundles] Triggering Delta Live Tables</title>
      <link>https://community.databricks.com/t5/data-engineering/databricks-asset-bundles-triggering-delta-live-tables/m-p/108799#M43145</link>
      <description>&lt;P&gt;That worked for me. Thanks!&lt;/P&gt;</description>
      <pubDate>Tue, 04 Feb 2025 14:21:22 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/databricks-asset-bundles-triggering-delta-live-tables/m-p/108799#M43145</guid>
      <dc:creator>jonhieb</dc:creator>
      <dc:date>2025-02-04T14:21:22Z</dc:date>
    </item>
    <item>
      <title>Re: [Databricks Asset Bundles] Triggering Delta Live Tables</title>
      <link>https://community.databricks.com/t5/data-engineering/databricks-asset-bundles-triggering-delta-live-tables/m-p/115683#M45156</link>
      <description>&lt;P&gt;So this is not possible by configuring a job within a bundle using YAML? Only via the UI or API?&lt;/P&gt;</description>
      <pubDate>Wed, 16 Apr 2025 18:31:34 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/databricks-asset-bundles-triggering-delta-live-tables/m-p/115683#M45156</guid>
      <dc:creator>cyrillax</dc:creator>
      <dc:date>2025-04-16T18:31:34Z</dc:date>
    </item>
    <item>
      <title>Re: [Databricks Asset Bundles] Triggering Delta Live Tables</title>
      <link>https://community.databricks.com/t5/data-engineering/databricks-asset-bundles-triggering-delta-live-tables/m-p/115684#M45157</link>
      <description>&lt;P&gt;&lt;SPAN&gt;No, you can do it with Bundles. Create a pipeline task inside a workflow YAML file, then schedule the entire workflow instead of the DLT pipeline directly.&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 16 Apr 2025 18:59:21 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/databricks-asset-bundles-triggering-delta-live-tables/m-p/115684#M45157</guid>
      <dc:creator>jonhieb</dc:creator>
      <dc:date>2025-04-16T18:59:21Z</dc:date>
    </item>
    <item>
      <title>Re: [Databricks Asset Bundles] Triggering Delta Live Tables</title>
      <link>https://community.databricks.com/t5/data-engineering/databricks-asset-bundles-triggering-delta-live-tables/m-p/117684#M45545</link>
      <description>&lt;P&gt;Can you please provide an example how to do this?&lt;/P&gt;</description>
      <pubDate>Mon, 05 May 2025 11:09:33 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/databricks-asset-bundles-triggering-delta-live-tables/m-p/117684#M45545</guid>
      <dc:creator>dr-dror</dc:creator>
      <dc:date>2025-05-05T11:09:33Z</dc:date>
    </item>
    <item>
      <title>Re: [Databricks Asset Bundles] Triggering Delta Live Tables</title>
      <link>https://community.databricks.com/t5/data-engineering/databricks-asset-bundles-triggering-delta-live-tables/m-p/117693#M45548</link>
      <description>&lt;P&gt;Of course. In this example, I use the &lt;EM&gt;pipeline_task&lt;/EM&gt; argument to reference a DLT pipeline that I created previously. This lets you schedule your DLT pipeline inside your workflow.&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;# Job to orchestrate the data_quality_pipelines DLT pipeline.
resources:
  jobs:
    data_quality_pipelines_job:
      name: schedule_data_quality_job

      schedule:
        quartz_cron_expression: "0 0 8 ? * Mon" # At 8:00:00am, on Monday
        timezone_id: "America/Sao_Paulo"

      timeout_seconds: 3600  # 1 hour

      email_notifications:
        on_failure:
          - ${workspace.current_user.userName}
      webhook_notifications:
        on_failure:
          - id: ${var.webhook_id}

      tasks:
        - task_key: data_quality_task
          pipeline_task: 
            pipeline_id: ${var.input_tables_pipeline_id}
            full_refresh: false

        - task_key: output_data_quality_task
          pipeline_task:
            pipeline_id: ${var.output_tables_pipeline_id}
            full_refresh: false
        
        - task_key: notify_business_areas
          depends_on:
            - task_key: data_quality_task
            - task_key: output_data_quality_task
          run_job_task:
            job_id: ${var.send_notifications_job_id}
          
        - task_key: create_jira_tasks
          depends_on:
            - task_key: notify_business_areas
          run_job_task:
            job_id: ${var.create_jira_tasks_job_id}
          max_retries: 0

      run_as:
        user_name: xxxxxxxx@yyyyyyy
        
      parameters:
        - name: notification_conf_file
          default: ${var.notification_conf_file}&lt;/LI-CODE&gt;</description>
      <pubDate>Mon, 05 May 2025 12:02:21 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/databricks-asset-bundles-triggering-delta-live-tables/m-p/117693#M45548</guid>
      <dc:creator>jonhieb</dc:creator>
      <dc:date>2025-05-05T12:02:21Z</dc:date>
    </item>
  </channel>
</rss>

