<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Databricks Database synced tables in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/databricks-database-synced-tables/m-p/154142#M54068</link>
    <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/226577"&gt;@prakharsachan&lt;/a&gt;,&lt;/P&gt;
&lt;P&gt;synced_database_table creation assumes that the Unity Catalog source table referenced in spec.source_table_full_name already exists and is readable. The API treats this as the source table to sync from; if it can’t be read, the synced table pipeline fails with errors such as SOURCE_READ_ERROR or TABLE_DOES_NOT_EXIST. In practice, that means you must materialize the source UC table (for example, by running the Lakeflow pipeline once) before creating the synced_database_table resource in a bundle.&lt;/P&gt;
&lt;P&gt;Because bundles only define and deploy resources (they don’t run your pipelines to materialize data), there is currently no one-shot way to create the pipeline, run it, and create the synced table in a single DAB deploy.&lt;/P&gt;
&lt;P&gt;The workaround is a two-stage deployment.&lt;/P&gt;
&lt;P&gt;In stage 1, deploy and run the source pipelines. The bundle contains only the Lakeflow pipelines that build the UC source tables (no synced_database_tables yet). Run databricks bundle deploy, then trigger those pipelines (manually, via Jobs, or from CI) so the Delta/UC tables actually exist.&lt;/P&gt;
&lt;P&gt;In stage 2, add the synced tables to the bundle: extend the same bundle with resources.synced_database_tables entries pointing at the now-existing source_table_full_name tables, and deploy again. create_synced_database_table now succeeds because validation can read the source tables.&lt;/P&gt;
&lt;P&gt;This is the same pattern Databricks recommends for other resources that cannot be referenced until they exist, such as UC volumes: first deploy the volume, then reference it (for example in artifact_path) in subsequent deployments.&lt;/P&gt;
&lt;P&gt;Hope this helps.&lt;/P&gt;
&lt;P class="p1"&gt;&lt;FONT size="2" color="#FF6600"&gt;&lt;STRONG&gt;&lt;I&gt;If this answer resolves your question, could you mark it as “Accept as Solution”? That helps other users quickly find the correct fix.&lt;/I&gt;&lt;/STRONG&gt;&lt;/FONT&gt;&lt;I&gt;&lt;/I&gt;&lt;/P&gt;</description>
    <pubDate>Sat, 11 Apr 2026 20:30:33 GMT</pubDate>
    <dc:creator>Ashwin_DSA</dc:creator>
    <dc:date>2026-04-11T20:30:33Z</dc:date>
    <item>
      <title>Databricks Database synced tables</title>
      <link>https://community.databricks.com/t5/data-engineering/databricks-database-synced-tables/m-p/154090#M54065</link>
      <description>&lt;P&gt;When I am deploying synced tables and the pipelines which create the source tables(used by synced tables) using DABs for the first time, the error occurs that the source tables doesnt exist (yes because the pipeline hasnt ran yet), then whats the workaround for this?&lt;/P&gt;</description>
      <pubDate>Fri, 10 Apr 2026 16:30:04 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/databricks-database-synced-tables/m-p/154090#M54065</guid>
      <dc:creator>prakharsachan</dc:creator>
      <dc:date>2026-04-10T16:30:04Z</dc:date>
    </item>
    <item>
      <title>Re: Databricks Database synced tables</title>
      <link>https://community.databricks.com/t5/data-engineering/databricks-database-synced-tables/m-p/154142#M54068</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/226577"&gt;@prakharsachan&lt;/a&gt;,&lt;/P&gt;
&lt;P&gt;synced_database_table creation assumes that the Unity Catalog source table referenced in spec.source_table_full_name already exists and is readable. The API treats this as the source table to sync from; if it can’t be read, the synced table pipeline fails with errors such as SOURCE_READ_ERROR or TABLE_DOES_NOT_EXIST. In practice, that means you must materialize the source UC table (for example, by running the Lakeflow pipeline once) before creating the synced_database_table resource in a bundle.&lt;/P&gt;
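&lt;P&gt;A quick way to check this precondition up front is to ask Unity Catalog for the table before deploying the synced table resource (the table name below is a placeholder, and this assumes the current Databricks CLI):&lt;/P&gt;
&lt;PRE&gt;# Placeholder name -- substitute your actual catalog.schema.table.
# A non-zero exit means the synced table creation would fail too:
databricks tables get main.sales.orders&lt;/PRE&gt;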
&lt;P&gt;Because bundles only define and deploy resources (they don’t run your pipelines to materialize data), there is currently no one-shot way to create the pipeline, run it, and create the synced table in a single DAB deploy.&lt;/P&gt;
&lt;P&gt;The workaround is a two-stage deployment.&lt;/P&gt;
&lt;P&gt;In stage 1, deploy and run the source pipelines. The bundle contains only the Lakeflow pipelines that build the UC source tables (no synced_database_tables yet). Run databricks bundle deploy, then trigger those pipelines (manually, via Jobs, or from CI) so the Delta/UC tables actually exist.&lt;/P&gt;
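&lt;P&gt;A minimal stage 1 sketch might look like this (all names are placeholders, not your actual bundle; check the fields against the bundle schema for your CLI version):&lt;/P&gt;
&lt;PRE&gt;# databricks.yml -- stage 1: pipeline only, no synced tables yet
bundle:
  name: my_bundle                  # placeholder bundle name

resources:
  pipelines:
    source_pipeline:               # placeholder resource key
      name: build-source-tables
      catalog: main                # UC catalog the pipeline writes to
      schema: sales                # target schema for the source tables
      libraries:
        - notebook:
            path: ./src/build_tables.py&lt;/PRE&gt;
&lt;P&gt;Then deploy and run it so the UC source tables are materialized before stage 2:&lt;/P&gt;
&lt;PRE&gt;databricks bundle deploy
databricks bundle run source_pipeline&lt;/PRE&gt;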
&lt;P&gt;In stage 2, add the synced tables to the bundle: extend the same bundle with resources.synced_database_tables entries pointing at the now-existing source_table_full_name tables, and deploy again. create_synced_database_table now succeeds because validation can read the source tables.&lt;/P&gt;
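&lt;P&gt;And a stage 2 sketch extending the same bundle. Only resources.synced_database_tables and spec.source_table_full_name come from the discussion above; every other field and value here is illustrative and should be verified against the bundle schema before use:&lt;/P&gt;
&lt;PRE&gt;# databricks.yml -- stage 2: add the synced table, then deploy again
resources:
  synced_database_tables:
    orders_synced:                              # placeholder resource key
      name: main.sales.orders_synced            # full UC name of the synced table
      database_instance_name: my-db-instance    # placeholder instance
      logical_database_name: sales_db           # placeholder database
      spec:
        source_table_full_name: main.sales.orders  # must exist and be readable
        scheduling_policy: SNAPSHOT&lt;/PRE&gt;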
&lt;P&gt;This is the same pattern Databricks recommends for other resources that cannot be referenced until they exist, such as UC volumes: first deploy the volume, then reference it (for example in artifact_path) in subsequent deployments.&lt;/P&gt;
&lt;P&gt;Hope this helps.&lt;/P&gt;
&lt;P class="p1"&gt;&lt;FONT size="2" color="#FF6600"&gt;&lt;STRONG&gt;&lt;I&gt;If this answer resolves your question, could you mark it as “Accept as Solution”? That helps other users quickly find the correct fix.&lt;/I&gt;&lt;/STRONG&gt;&lt;/FONT&gt;&lt;I&gt;&lt;/I&gt;&lt;/P&gt;</description>
      <pubDate>Sat, 11 Apr 2026 20:30:33 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/databricks-database-synced-tables/m-p/154142#M54068</guid>
      <dc:creator>Ashwin_DSA</dc:creator>
      <dc:date>2026-04-11T20:30:33Z</dc:date>
    </item>
  </channel>
</rss>

