<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Moving existing Delta Live Table to Asset Bundle in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/moving-existing-delta-live-table-to-asset-bundle/m-p/100101#M40184</link>
    <description>&lt;P&gt;I have done these steps, but my DLT still took a long time to process. One thing that changed: the path to the notebook with my pipeline logic is different because I am deploying it as a bundle. Is this a problem? Also, the name of the pipeline changed because of a prefix I added in the Asset Bundle.&lt;/P&gt;</description>
    <pubDate>Tue, 26 Nov 2024 14:19:54 GMT</pubDate>
    <dc:creator>Isa1</dc:creator>
    <dc:date>2024-11-26T14:19:54Z</dc:date>
    <item>
      <title>Moving existing Delta Live Table to Asset Bundle</title>
      <link>https://community.databricks.com/t5/data-engineering/moving-existing-delta-live-table-to-asset-bundle/m-p/100068#M40180</link>
      <description>&lt;P&gt;Hi!&lt;/P&gt;&lt;P&gt;I am creating an Asset Bundle, which also includes my streaming Delta Live Table Pipelines. I want to move these DLT pipelines to the Asset Bundle, without having to run my DLT streaming Pipeline on all historical files (this takes a lot of compute and time). Is there a way to migrate an existing DLT Pipeline to Asset Bundles?&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Tue, 26 Nov 2024 12:14:46 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/moving-existing-delta-live-table-to-asset-bundle/m-p/100068#M40180</guid>
      <dc:creator>Isa1</dc:creator>
      <dc:date>2024-11-26T12:14:46Z</dc:date>
    </item>
    <item>
      <title>Re: Moving existing Delta Live Table to Asset Bundle</title>
      <link>https://community.databricks.com/t5/data-engineering/moving-existing-delta-live-table-to-asset-bundle/m-p/100098#M40182</link>
      <description>&lt;P class="_1t7bu9h1 paragraph"&gt;&lt;SPAN&gt;Yes, you can migrate an existing Delta Live Table (DLT) pipeline to an Asset Bundle without having to reprocess all historical files. Here are the steps to achieve this:&lt;/SPAN&gt;&lt;/P&gt;
&lt;OL&gt;
&lt;LI&gt;
&lt;P class="_1t7bu9h1 paragraph"&gt;&lt;SPAN&gt;&lt;STRONG&gt;Create a Databricks Asset Bundle&lt;/STRONG&gt;: Use the Databricks CLI to initialize a new bundle. This will create a &lt;CODE&gt;databricks.yml&lt;/CODE&gt; file in the root of your project, which will be used to define your Databricks resources, including your DLT pipelines.&lt;/SPAN&gt;&lt;/P&gt;
&lt;/LI&gt;
&lt;LI&gt;
&lt;P class="_1t7bu9h1 paragraph"&gt;&lt;SPAN&gt;&lt;STRONG&gt;Define the DLT Pipeline in the Bundle&lt;/STRONG&gt;: In the &lt;CODE&gt;databricks.yml&lt;/CODE&gt; file, you will need to define your DLT pipeline. This involves specifying the pipeline's configuration, such as the path to the notebook or script that defines the pipeline logic.&lt;/SPAN&gt;&lt;/P&gt;
&lt;/LI&gt;
&lt;LI&gt;
&lt;P class="_1t7bu9h1 paragraph"&gt;&lt;SPAN&gt;&lt;STRONG&gt;Deploy the Bundle&lt;/STRONG&gt;: Use the Databricks CLI to deploy the bundle to your target environment. This will create the necessary resources in your Databricks workspace based on the definitions in the &lt;CODE&gt;databricks.yml&lt;/CODE&gt; file.&lt;/SPAN&gt;&lt;/P&gt;
&lt;/LI&gt;
&lt;LI&gt;
&lt;P class="_1t7bu9h1 paragraph"&gt;&lt;SPAN&gt;&lt;STRONG&gt;Run the Pipeline&lt;/STRONG&gt;: Once the bundle is deployed, you can run the DLT pipeline from the Databricks extension panel or using the CLI. This will start the pipeline without reprocessing all historical files, as the pipeline will continue from its last processed state.&lt;/SPAN&gt;&lt;/P&gt;
&lt;/LI&gt;
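&lt;LI&gt;
&lt;P class="_1t7bu9h1 paragraph"&gt;&lt;SPAN&gt;For reference, a minimal &lt;CODE&gt;databricks.yml&lt;/CODE&gt; pipeline definition might look like the sketch below. The bundle name, resource key, pipeline name, notebook path, and target schema are placeholders to replace with your own values:&lt;/SPAN&gt;&lt;/P&gt;
&lt;LI-CODE lang="yaml"&gt;# Sketch of a bundle with one DLT pipeline; all names/paths are placeholders.
bundle:
  name: my_bundle

resources:
  pipelines:
    my_dlt_pipeline:        # resource key used by the CLI
      name: my_dlt_pipeline # pipeline name shown in the workspace
      target: my_schema     # schema the pipeline publishes to
      libraries:
        - notebook:
            path: ./src/dlt_pipeline.py&lt;/LI-CODE&gt;
&lt;/LI&gt;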
&lt;/OL&gt;</description>
      <pubDate>Tue, 26 Nov 2024 14:12:43 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/moving-existing-delta-live-table-to-asset-bundle/m-p/100098#M40182</guid>
      <dc:creator>Walter_C</dc:creator>
      <dc:date>2024-11-26T14:12:43Z</dc:date>
    </item>
    <item>
      <title>Re: Moving existing Delta Live Table to Asset Bundle</title>
      <link>https://community.databricks.com/t5/data-engineering/moving-existing-delta-live-table-to-asset-bundle/m-p/100100#M40183</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/133860"&gt;@Isa1&lt;/a&gt;&amp;nbsp;,&lt;BR /&gt;&lt;BR /&gt;&lt;SPAN&gt;If you have existing pipelines that were created using the Databricks user interface or API that you want to move to bundles, you must define them in a bundle’s configuration files. Databricks recommends that you first create a bundle using the steps below and then validate whether the bundle works. You can then add additional definitions, notebooks, and other sources to the bundle.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;You can follow the official documentation entry and repeat the steps there:&lt;BR /&gt;&lt;BR /&gt;&lt;A href="https://docs.databricks.com/en/dev-tools/bundles/pipelines-tutorial.html#add-an-existing-pipeline-definition-to-a-bundle" target="_blank" rel="noopener"&gt;Develop Delta Live Tables pipelines with Databricks Asset Bundles | Databricks on AWS&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 26 Nov 2024 14:17:44 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/moving-existing-delta-live-table-to-asset-bundle/m-p/100100#M40183</guid>
      <dc:creator>szymon_dybczak</dc:creator>
      <dc:date>2024-11-26T14:17:44Z</dc:date>
    </item>
    <item>
      <title>Re: Moving existing Delta Live Table to Asset Bundle</title>
      <link>https://community.databricks.com/t5/data-engineering/moving-existing-delta-live-table-to-asset-bundle/m-p/100101#M40184</link>
      <description>&lt;P&gt;I have done these steps, but my DLT still took a long time to process. One thing that changed: the path to the notebook with my pipeline logic is different because I am deploying it as a bundle. Is this a problem? Also, the name of the pipeline changed because of a prefix I added in the Asset Bundle.&lt;/P&gt;</description>
      <pubDate>Tue, 26 Nov 2024 14:19:54 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/moving-existing-delta-live-table-to-asset-bundle/m-p/100101#M40184</guid>
      <dc:creator>Isa1</dc:creator>
      <dc:date>2024-11-26T14:19:54Z</dc:date>
    </item>
    <item>
      <title>Re: Moving existing Delta Live Table to Asset Bundle</title>
      <link>https://community.databricks.com/t5/data-engineering/moving-existing-delta-live-table-to-asset-bundle/m-p/100104#M40186</link>
      <description>&lt;P&gt;Changing the path to the notebook or the name of the pipeline in your Delta Live Table (DLT) pipeline can indeed cause issues: either change can lead to the recreation of the pipeline, so the new pipeline no longer continues from the old one's processed state.&lt;/P&gt;</description>
      <pubDate>Tue, 26 Nov 2024 14:23:45 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/moving-existing-delta-live-table-to-asset-bundle/m-p/100104#M40186</guid>
      <dc:creator>Walter_C</dc:creator>
      <dc:date>2024-11-26T14:23:45Z</dc:date>
    </item>
    <item>
      <title>Re: Moving existing Delta Live Table to Asset Bundle</title>
      <link>https://community.databricks.com/t5/data-engineering/moving-existing-delta-live-table-to-asset-bundle/m-p/100105#M40187</link>
      <description>&lt;P&gt;So maybe try to use the bind command? This command allows you to link&amp;nbsp;&lt;SPAN&gt;bundle-defined jobs and pipelines to existing jobs and pipelines in the Databricks workspace so that they become managed by Databricks Asset Bundles.&lt;/SPAN&gt;&lt;/P&gt;&lt;LI-CODE lang="shell"&gt;databricks bundle deployment bind [resource-key] [resource-id]&lt;/LI-CODE&gt;&lt;P&gt;&lt;A href="https://docs.gcp.databricks.com/en/dev-tools/cli/bundle-commands.html#bind-bundle-resources" target="_blank" rel="noopener"&gt;Bind bundle resources | Databricks on Google Cloud&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 26 Nov 2024 14:31:20 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/moving-existing-delta-live-table-to-asset-bundle/m-p/100105#M40187</guid>
      <dc:creator>szymon_dybczak</dc:creator>
      <dc:date>2024-11-26T14:31:20Z</dc:date>
    </item>
    <item>
      <title>Re: Moving existing Delta Live Table to Asset Bundle</title>
      <link>https://community.databricks.com/t5/data-engineering/moving-existing-delta-live-table-to-asset-bundle/m-p/100110#M40189</link>
      <description>&lt;P&gt;And to add one thing: in&lt;SPAN&gt;&amp;nbsp;Delta Live Tables, checkpoints are stored under the storage location specified in the DLT settings. Each table gets a dedicated directory under&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN&gt;&lt;CODE&gt;storage_location/checkpoints/&amp;lt;dlt_table_name&amp;gt;&lt;/CODE&gt;. So if you would like to avoid running your pipeline from the start, you need to use the bind command; otherwise the new pipeline name will create a new checkpoint directory.&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 26 Nov 2024 14:38:30 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/moving-existing-delta-live-table-to-asset-bundle/m-p/100110#M40189</guid>
      <dc:creator>szymon_dybczak</dc:creator>
      <dc:date>2024-11-26T14:38:30Z</dc:date>
    </item>
  </channel>
</rss>

