Hello @liquibricks
Yes, this is known and expected behaviour: changing the comment value passed to the @DP.table decorator (or similar DLT table decorators) does not automatically update the physical table's comment in the underlying catalogue when the pipeline is restarted or updated, if the table already exists. Once a streaming table is created, metadata set at creation time, such as the table comment, stays attached to the table unless it is explicitly changed through a supported metadata-altering operation.
For SDP/DLT streaming tables, Databricks blocks direct DDL changes to the schema and table properties (including table comments) outside the pipeline, to keep the table's metadata consistent with the pipeline definition. If you try COMMENT ON TABLE, Databricks throws an error telling you to use CREATE OR REFRESH instead.
Doc: https://kb.databricks.com/en_US/delta-live-tables/unable-to-modify-comments-on-streaming-tables
So the only way to fix this is to update the comment in the pipeline code and then run a pipeline update for the change to take effect.
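As a minimal sketch of that fix (the table name, comment text, and source path below are hypothetical, and this assumes the Python `dlt` API, which is only available inside a Databricks pipeline runtime):

```python
import dlt  # only resolvable inside a Databricks SDP/DLT pipeline

# Editing the `comment` argument here is the supported way to change the
# table comment; the new value is applied to the physical table the next
# time a pipeline update runs, not merely on a restart.
@dlt.table(
    name="orders_bronze",  # hypothetical table name
    comment="Raw orders ingested from cloud storage (updated description)",
)
def orders_bronze():
    # hypothetical Auto Loader source path
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load("/Volumes/main/raw/orders")
    )
```

After saving the change, trigger a pipeline update so the new comment is written through to the table in the catalogue.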
Anudeep