<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Adding comments to Streaming Tables created with SQL Server Data Ingestion in Data Governance</title>
    <link>https://community.databricks.com/t5/data-governance/adding-comments-to-streaming-tables-created-with-sql-server-data/m-p/135180#M2637</link>
    <description>&lt;P&gt;I have been tasked with governing the data within our Databricks instance. A large part of this is adding Comments or Descriptions, and Tags to our Schemas, Tables and Columns in Unity Catalog.&lt;/P&gt;&lt;P&gt;For most objects this has been straight-forward, but one place where I'm running into issues is in adding Comments or Descriptions to Streaming Tables that were created through the SQL Server Data Ingestion "Wizard", described here: &lt;A href="https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sql-server-pipeline" target="_blank" rel="noopener"&gt;Ingest data from SQL Server - Azure Databricks | Microsoft Learn&lt;/A&gt;.&lt;/P&gt;&lt;P&gt;All documentation I have read about adding comments to Streaming Tables mentions adding the Comments to the Lakeflow Declarative Pipelines directly, which would work if we were creating our Lakeflow Declarative Pipelines through Notebooks and ETL Pipelines.&lt;/P&gt;&lt;P&gt;Does anyone know of a way to add these Comments? I see no options through the Data Ingestion UI or the Jobs &amp;amp; Pipelines UI.&lt;/P&gt;&lt;P&gt;Note: we did look into adding Comments and Tags through DDL commands and we managed to set up some Column Comments and Tags through this approach but the Comments did not persist, and we aren't sure if the Tags will persist.&lt;/P&gt;</description>
    <pubDate>Fri, 17 Oct 2025 01:57:56 GMT</pubDate>
    <dc:creator>pdg27</dc:creator>
    <dc:date>2025-10-17T01:57:56Z</dc:date>
    <item>
      <title>Adding comments to Streaming Tables created with SQL Server Data Ingestion</title>
      <link>https://community.databricks.com/t5/data-governance/adding-comments-to-streaming-tables-created-with-sql-server-data/m-p/135180#M2637</link>
      <description>&lt;P&gt;I have been tasked with governing the data within our Databricks instance. A large part of this is adding Comments or Descriptions, and Tags to our Schemas, Tables and Columns in Unity Catalog.&lt;/P&gt;&lt;P&gt;For most objects this has been straight-forward, but one place where I'm running into issues is in adding Comments or Descriptions to Streaming Tables that were created through the SQL Server Data Ingestion "Wizard", described here: &lt;A href="https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sql-server-pipeline" target="_blank" rel="noopener"&gt;Ingest data from SQL Server - Azure Databricks | Microsoft Learn&lt;/A&gt;.&lt;/P&gt;&lt;P&gt;All documentation I have read about adding comments to Streaming Tables mentions adding the Comments to the Lakeflow Declarative Pipelines directly, which would work if we were creating our Lakeflow Declarative Pipelines through Notebooks and ETL Pipelines.&lt;/P&gt;&lt;P&gt;Does anyone know of a way to add these Comments? I see no options through the Data Ingestion UI or the Jobs &amp;amp; Pipelines UI.&lt;/P&gt;&lt;P&gt;Note: we did look into adding Comments and Tags through DDL commands and we managed to set up some Column Comments and Tags through this approach but the Comments did not persist, and we aren't sure if the Tags will persist.&lt;/P&gt;</description>
      <pubDate>Fri, 17 Oct 2025 01:57:56 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-governance/adding-comments-to-streaming-tables-created-with-sql-server-data/m-p/135180#M2637</guid>
      <dc:creator>pdg27</dc:creator>
      <dc:date>2025-10-17T01:57:56Z</dc:date>
    </item>
    <item>
      <title>Re: Adding comments to Streaming Tables created with SQL Server Data Ingestion</title>
      <link>https://community.databricks.com/t5/data-governance/adding-comments-to-streaming-tables-created-with-sql-server-data/m-p/135755#M2643</link>
      <description>&lt;P&gt;It is currently not possible to reliably add or persist comments or descriptions on Streaming Tables created via the SQL Server Data Ingestion Wizard in Databricks using the Data Ingestion UI or the Jobs &amp;amp; Pipelines UI. All metadata management for Lakeflow Streaming Tables, including comments and tags, is expected to be handled within the Lakeflow Declarative Pipeline definitions themselves, or programmatically via code in Notebooks and ETL pipelines.&lt;/P&gt;
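&lt;P&gt;For pipelines defined in code, the comment is declared where the table itself is defined. A minimal sketch follows; the table name and comment text are hypothetical, and &lt;CODE&gt;dlt&lt;/CODE&gt; is stubbed so the snippet is self-contained outside a pipeline runtime:&lt;/P&gt;

```python
# Hypothetical sketch of declaring a comment in Lakeflow/DLT pipeline code.
# `import dlt` only resolves on a Databricks pipeline cluster; a stub is used
# here so the snippet runs anywhere. Table and source names are made up.
try:
    import dlt  # real module inside a Databricks pipeline runtime
except ImportError:
    class _DltStub:
        def table(self, **table_kwargs):
            def decorator(fn):
                fn.table_kwargs = table_kwargs  # record metadata for illustration
                return fn
            return decorator
    dlt = _DltStub()

@dlt.table(
    name="sqlserver_orders",  # hypothetical target streaming table
    comment="Orders replicated from SQL Server via Lakeflow Connect.",
)
def sqlserver_orders():
    # In a real pipeline this would return a streaming DataFrame,
    # e.g. spark.readStream.table(...) — omitted here.
    return None
```

Because the comment lives in the table definition, every pipeline update re-applies it, which is exactly what the UI-driven wizard tables lack.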
&lt;H2&gt;DDL and UI Approaches&lt;/H2&gt;
&lt;UL&gt;
&lt;LI&gt;
&lt;P&gt;Attempts to add comments or tags to these Streaming Tables using Databricks SQL DDL commands (&lt;CODE&gt;COMMENT ON TABLE&lt;/CODE&gt; or &lt;CODE&gt;ALTER TABLE … SET TAGS&lt;/CODE&gt;) may appear to work at first, but typically do not persist across pipeline runs or refresh cycles. The pipeline re-creates its tables during refreshes, discarding any manually set metadata that is not present in the pipeline definition.&lt;/P&gt;
&lt;/LI&gt;
&lt;LI&gt;
&lt;P&gt;Similarly, adding or editing comments through the Catalog Explorer UI is unreliable for Streaming Tables if the underlying pipeline-managed metadata is not updated, or if required cluster settings (such as &lt;CODE&gt;spark.databricks.delta.catalog.update.enabled&lt;/CODE&gt;) are not set correctly. For regular Delta or Unity Catalog tables, setting this to &lt;CODE&gt;true&lt;/CODE&gt; propagates comment metadata to the catalog UI, but it does not apply to pipeline-managed streaming tables.&lt;/P&gt;
&lt;/LI&gt;
&lt;/UL&gt;
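&lt;P&gt;For reference, the DDL attempts above look like the following (the three-level table name and tag values are hypothetical). The statements execute successfully, but on a wizard-created streaming table the comment is typically reverted at the next pipeline update:&lt;/P&gt;

```python
# Build the DDL statements described above; all names and values are hypothetical.
table = "main.sqlserver_ingest.orders"

comment_stmt = f"COMMENT ON TABLE {table} IS 'Orders from SQL Server'"
tags_stmt = f"ALTER TABLE {table} SET TAGS ('source' = 'sqlserver')"

# In a Databricks notebook or SQL warehouse these would be executed with:
#   spark.sql(comment_stmt)
#   spark.sql(tags_stmt)
print(comment_stmt)
print(tags_stmt)
```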
&lt;H2&gt;Recommendation &amp;amp; Workarounds&lt;/H2&gt;
&lt;UL&gt;
&lt;LI&gt;
&lt;P&gt;&lt;STRONG&gt;Lakeflow Pipeline Code:&lt;/STRONG&gt; The only durable way to add comments or descriptions to Streaming Tables is to define them in the Lakeflow Declarative Pipeline code itself, at table creation or within the DLT definitions (e.g., using the &lt;CODE&gt;comment&lt;/CODE&gt; argument of &lt;CODE&gt;@dlt.table&lt;/CODE&gt;). This requires exporting and editing pipeline code, which the Wizard-driven ingestion flow does not expose.&lt;/P&gt;
&lt;/LI&gt;
&lt;LI&gt;
&lt;P&gt;&lt;STRONG&gt;Tags:&lt;/STRONG&gt; Tags can sometimes be added via DDL, but their persistence depends on pipeline behavior. If the pipeline overwrites the table, manually applied tags may also be lost unless they are set in the pipeline definition.&lt;/P&gt;
&lt;/LI&gt;
&lt;LI&gt;
&lt;P&gt;&lt;STRONG&gt;Governance Strategy:&lt;/STRONG&gt; For environments that require robust, persistent governance metadata, migrate from Wizard-based ingestion to code-defined Lakeflow pipelines. This gives full control over comments, descriptions, and tags, because they live in the pipeline definition that each refresh re-applies.&lt;/P&gt;
&lt;/LI&gt;
&lt;/UL&gt;</description>
      <pubDate>Wed, 22 Oct 2025 17:45:51 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-governance/adding-comments-to-streaming-tables-created-with-sql-server-data/m-p/135755#M2643</guid>
      <dc:creator>mark_ott</dc:creator>
      <dc:date>2025-10-22T17:45:51Z</dc:date>
    </item>
  </channel>
</rss>

