<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Does Lakeflow Connect Have Any Change Tracking Diagnostics? in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/does-lakeflow-connect-have-any-change-tracking-diagnostics/m-p/155270#M54223</link>
    <description>&lt;P&gt;Thanks&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/216690"&gt;@Ashwin_DSA&lt;/a&gt;,&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/226887"&gt;@amirabedhiafi&lt;/a&gt;&amp;nbsp;for your swift responses.&lt;/P&gt;&lt;P&gt;I had high hopes when I saw that&amp;nbsp;&lt;SPAN&gt;lakeflowUtilityVersion_1_5() is queried, as I found the database user&amp;nbsp;for the ingestion gateway connection (i.e. the @User parameter for both dbo.lakeflowSetupChangeTracking and&amp;nbsp;dbo.lakeflowFixPermissions) DIDN'T have execute permission on that function (&lt;EM&gt;clearly an oversight in lakeflowFixPermissions which Databricks ought to address&lt;/EM&gt;).&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;But after granting execute permission we are still getting exactly the same error.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;And our setup meets all the other requirements you have mentioned between you.&lt;/P&gt;&lt;P&gt;The one thing you both mention that doesn't make sense to me is the need for the trigger to be "&lt;SPAN&gt;actually writing rows" to the audit table. I realise that for DDL changes to be picked up this needs to be operational, but surely this can't be necessary for the ingestion to work &lt;U&gt;&lt;STRONG&gt;at all&lt;/STRONG&gt;&lt;/U&gt;. Isn't it perfectly normal/acceptable for the audit table to be empty? We are attempting to ingest a stable source that hasn't experienced any ALTER TABLE operations yet, so the trigger won't have fired. Or am I missing something?&lt;/SPAN&gt;&lt;/P&gt;</description>
    <pubDate>Thu, 23 Apr 2026 05:30:14 GMT</pubDate>
    <dc:creator>cvh</dc:creator>
    <dc:date>2026-04-23T05:30:14Z</dc:date>
    <item>
      <title>Does Lakeflow Connect Have Any Change Tracking Diagnostics?</title>
      <link>https://community.databricks.com/t5/data-engineering/does-lakeflow-connect-have-any-change-tracking-diagnostics/m-p/155166#M54200</link>
      <description>&lt;P&gt;&lt;SPAN&gt;We have set up Change Tracking on multiple SQL Servers for Lakeflow Connect successfully in the past, but lately we are having lots of problems with a couple of servers. The latest utility script has been run and both&amp;nbsp;lakeflowSetupChangeTracking and&amp;nbsp;lakeflowFixPermissions have been executed several times. The database trigger&amp;nbsp;[lakeflowDdlAuditTrigger_1_5] is enabled, CT has been enabled on the database (checked sys.change_tracking_databases) and on the tables we want to ingest (checked sys.change_tracking_tables). We can see the ddl audit table (dbo.[lakeflowDdlAudit_1_5]) is also present, but when we try to run the ingestion pipeline we get ...&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;FONT face="courier new,courier"&gt;&lt;SPAN&gt;INGESTION_GATEWAY_DDL_OBJECTS_MISSING&lt;/SPAN&gt;&lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&lt;FONT face="courier new,courier"&gt;&lt;SPAN&gt;DDL objects missing on table '&amp;lt;TABLE&amp;gt;'. Execute the DDL objects script and full refresh the table on the Ingestion Pipeline&lt;/SPAN&gt;&lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&lt;FONT face="courier new,courier"&gt;&lt;SPAN&gt;Reason: - Catalog is not properly configured to capture DDL changes for the provided set of tables. How to fix: - To capture DDL changes for a table with LakeFlow Connect, appropriate set of support objects must be setup on the catalog. Different support objects are needed depending if tables that are replicated are using CT and/or CDC extraction mechanism. 
- Investigation shows that only CT tables are included in the replication, but corresponding DDL support object do not exist in the catalog.&lt;/SPAN&gt;&lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&lt;FONT face="arial,helvetica,sans-serif"&gt;&lt;SPAN&gt;We also found this in the ingestion gateway error logging, but it still doesn't identify &lt;STRONG&gt;&lt;U&gt;specifically&lt;/U&gt; &lt;/STRONG&gt;where the problem is:&lt;/SPAN&gt;&lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&lt;FONT face="courier new,courier"&gt;&lt;SPAN&gt;ddlCaptureNotEnabledTables={&amp;lt;TABLE&amp;gt;= Reason: - Catalog is not properly configured to capture DDL changes for the provided set of tables. How to fix: - To capture DDL changes for a table with LakeFlow Connect, appropriate set of support objects must be setup on the catalog. Different support objects are needed depending if tables that are replicated are using CT and/or CDC extraction mechanism. - Investigation shows that only CT tables are included in the replication, but corresponding DDL support object do not exist in the catalog.&lt;/SPAN&gt;&lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;Is there anything that we can run on the SQL Server to give us more information about the specific problem? How does Lakeflow Connect determine that DDL has not been configured? When the documentation has been followed - and yet we still get this error - where to from here?&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 22 Apr 2026 05:52:31 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/does-lakeflow-connect-have-any-change-tracking-diagnostics/m-p/155166#M54200</guid>
      <dc:creator>cvh</dc:creator>
      <dc:date>2026-04-22T05:52:31Z</dc:date>
    </item>
    <item>
      <title>Re: Does Lakeflow Connect Have Any Change Tracking Diagnostics?</title>
      <link>https://community.databricks.com/t5/data-engineering/does-lakeflow-connect-have-any-change-tracking-diagnostics/m-p/155178#M54204</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/192127"&gt;@cvh&lt;/a&gt;,&lt;/P&gt;
&lt;P&gt;Having done some research, I have found that for SQL Server CT, Lakeflow expects the following (per DB/catalog):&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;Utility objects at version 1.5: dbo.lakeflowUtilityVersion_1_5() = '1.5'.&lt;/LI&gt;
&lt;LI&gt;CT DDL objects present in the same database as the CT tables:
&lt;UL&gt;
&lt;LI&gt;Table: dbo.lakeflowDdlAudit_1_5&lt;/LI&gt;
&lt;LI&gt;Database DDL trigger: lakeflowDdlAuditTrigger_1_5 (enabled)&lt;/LI&gt;
&lt;/UL&gt;
&lt;/LI&gt;
&lt;LI&gt;The ingestion user can read from dbo.lakeflowDdlAudit_1_5.&lt;/LI&gt;
&lt;LI&gt;The DDL trigger is actually writing rows for CT tables (i.e., DDL is being captured).&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;If any of these fail, the gateway reports INGESTION_GATEWAY_DDL_OBJECTS_MISSING and logs ddlCaptureNotEnabledTables.&lt;/P&gt;
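&lt;P&gt;As a quick sanity check of the first items (a sketch only, using the object names from your post; run it as the gateway's database user):&lt;/P&gt;
&lt;LI-CODE lang="sql"&gt;-- Utility objects at 1.5: should return '1.5'
SELECT dbo.lakeflowUtilityVersion_1_5() AS utility_version;

-- DDL audit table present: a non-NULL object id is expected
SELECT OBJECT_ID('dbo.lakeflowDdlAudit_1_5', 'U') AS audit_table_id;

-- Database-level DDL trigger present and enabled (is_disabled = 0)
SELECT name, is_disabled
FROM sys.triggers
WHERE parent_class_desc = 'DATABASE'
  AND name = 'lakeflowDdlAuditTrigger_1_5';

-- Read access to the audit table for the ingestion user
SELECT TOP (1) * FROM dbo.lakeflowDdlAudit_1_5;&lt;/LI-CODE&gt;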
&lt;P&gt;Does your setup meet all the above requirements? Do you know how to check these, or do you need help with that?&lt;/P&gt;
&lt;P class="p1"&gt;&lt;FONT size="2" color="#FF6600"&gt;&lt;STRONG&gt;&lt;I&gt;If this answer resolves your question, could you mark it as “Accept as Solution”? That helps other users quickly find the correct fix.&lt;/I&gt;&lt;/STRONG&gt;&lt;/FONT&gt;&lt;I&gt;&lt;/I&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 22 Apr 2026 08:58:28 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/does-lakeflow-connect-have-any-change-tracking-diagnostics/m-p/155178#M54204</guid>
      <dc:creator>Ashwin_DSA</dc:creator>
      <dc:date>2026-04-22T08:58:28Z</dc:date>
    </item>
    <item>
      <title>Re: Does Lakeflow Connect Have Any Change Tracking Diagnostics?</title>
      <link>https://community.databricks.com/t5/data-engineering/does-lakeflow-connect-have-any-change-tracking-diagnostics/m-p/155192#M54205</link>
      <description>&lt;P&gt;I think you need to check two more things that are easy to miss.&lt;/P&gt;&lt;P&gt;First, the ingestion user must have all the required metadata permissions, not just access to the source tables: visibility of sys.change_tracking_tables, sys.change_tracking_databases, sys.objects and sys.triggers, plus VIEW DATABASE PERFORMANCE STATE.&lt;BR /&gt;&lt;BR /&gt;Second, the DDL audit trigger must actually be writing rows into dbo.lakeflowDdlAudit_1_5 for CT tables. Databricks validates DDL support, CT/CDC state, utility objects, and permissions separately, so object existence alone is not sufficient.&lt;/P&gt;&lt;P&gt;If you upgraded the utility script, you need to verify that the gateway was stopped first and that the setup procedure was rerun with the same parameters as the original config.&lt;/P&gt;&lt;P&gt;Once you fix the issue, don't forget to run a full refresh of the affected table.&lt;/P&gt;</description>
      <pubDate>Wed, 22 Apr 2026 11:51:28 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/does-lakeflow-connect-have-any-change-tracking-diagnostics/m-p/155192#M54205</guid>
      <dc:creator>amirabedhiafi</dc:creator>
      <dc:date>2026-04-22T11:51:28Z</dc:date>
    </item>
    <item>
      <title>Re: Does Lakeflow Connect Have Any Change Tracking Diagnostics?</title>
      <link>https://community.databricks.com/t5/data-engineering/does-lakeflow-connect-have-any-change-tracking-diagnostics/m-p/155270#M54223</link>
      <description>&lt;P&gt;Thanks&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/216690"&gt;@Ashwin_DSA&lt;/a&gt;,&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/226887"&gt;@amirabedhiafi&lt;/a&gt;&amp;nbsp;for your swift responses.&lt;/P&gt;&lt;P&gt;I had high hopes when I saw that&amp;nbsp;&lt;SPAN&gt;lakeflowUtilityVersion_1_5() is queried, as I found the database user&amp;nbsp;for the ingestion gateway connection (i.e. the @User parameter for both dbo.lakeflowSetupChangeTracking and&amp;nbsp;dbo.lakeflowFixPermissions) DIDN'T have execute permission on that function (&lt;EM&gt;clearly an oversight in lakeflowFixPermissions which Databricks ought to address&lt;/EM&gt;).&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;But after granting execute permission we are still getting exactly the same error.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;And our setup meets all the other requirements you have mentioned between you.&lt;/P&gt;&lt;P&gt;The one thing you both mention that doesn't make sense to me is the need for the trigger to be "&lt;SPAN&gt;actually writing rows" to the audit table. I realise that for DDL changes to be picked up this needs to be operational, but surely this can't be necessary for the ingestion to work &lt;U&gt;&lt;STRONG&gt;at all&lt;/STRONG&gt;&lt;/U&gt;. Isn't it perfectly normal/acceptable for the audit table to be empty? We are attempting to ingest a stable source that hasn't experienced any ALTER TABLE operations yet, so the trigger won't have fired. Or am I missing something?&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Thu, 23 Apr 2026 05:30:14 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/does-lakeflow-connect-have-any-change-tracking-diagnostics/m-p/155270#M54223</guid>
      <dc:creator>cvh</dc:creator>
      <dc:date>2026-04-23T05:30:14Z</dc:date>
    </item>
    <item>
      <title>Re: Does Lakeflow Connect Have Any Change Tracking Diagnostics?</title>
      <link>https://community.databricks.com/t5/data-engineering/does-lakeflow-connect-have-any-change-tracking-diagnostics/m-p/155317#M54226</link>
      <description>&lt;P&gt;Hi!&lt;/P&gt;&lt;P&gt;Now I understand you better, and you are not missing anything there.&lt;/P&gt;&lt;P&gt;An empty dbo.lakeflowDdlAudit_1_5 table should be perfectly normal on a stable source that has not had any ALTER TABLE activity yet. In Databricks, for CT the DDL support objects are the audit table plus the database DDL trigger, and the trigger's role is to capture ALTER TABLE events. That implies the table is a history table for schema changes, not a table that must be prepopulated for ingestion to start. I couldn't find&amp;nbsp;anything in the current docs saying the audit table must already contain rows before a pipeline can run.&lt;/P&gt;&lt;P&gt;So I would correct that earlier point. To summarize: what is required is that the CT DDL support objects exist and are correctly configured; what is not inherently required is that the audit table already has entries.&lt;/P&gt;&lt;P&gt;As for the fact that lakeflowUtilityVersion_1_5() lacked execute permission: I think it explains part of the situation, but I would not expect it alone to explain this exact error, because Databricks has a separate insufficient-permissions error class for source-side permission problems. If the gateway were failing specifically because it could not execute an object, I would expect something closer to INGESTION_GATEWAY_SOURCE_INSUFFICIENT_PERMISSION_FAILURE,&amp;nbsp;not INGESTION_GATEWAY_DDL_OBJECTS_MISSING.&lt;/P&gt;&lt;P&gt;What I think is more likely is one of these:&lt;/P&gt;</description>
      <pubDate>Thu, 23 Apr 2026 10:50:27 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/does-lakeflow-connect-have-any-change-tracking-diagnostics/m-p/155317#M54226</guid>
      <dc:creator>amirabedhiafi</dc:creator>
      <dc:date>2026-04-23T10:50:27Z</dc:date>
    </item>
    <item>
      <title>Re: Does Lakeflow Connect Have Any Change Tracking Diagnostics?</title>
      <link>https://community.databricks.com/t5/data-engineering/does-lakeflow-connect-have-any-change-tracking-diagnostics/m-p/155333#M54228</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/192127"&gt;@cvh&lt;/a&gt;,&lt;/P&gt;
&lt;P&gt;Fair point. The writing point was to prove that the permissions work, and it is not a precondition.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Can you connect to SQL Server using the same login/user the gateway uses, and run these statements?&lt;/P&gt;
&lt;LI-CODE lang="python"&gt;-- 1) Utility functions
SELECT dbo.lakeflowUtilityVersion_1_5() AS utility_version;
SELECT dbo.lakeflowDetectPlatform()     AS platform;

-- 2) DDL audit table access
SELECT TOP (1) * FROM dbo.lakeflowDdlAudit_1_5;

-- 3) CT enabled on this database
SELECT ctdb.*
FROM sys.change_tracking_databases ctdb
JOIN sys.databases db ON db.database_id = ctdb.database_id
WHERE db.name = DB_NAME();

-- 4) CT enabled on one of the failing tables
SELECT s.name AS schema_name, t.name AS table_name, ct.*
FROM sys.change_tracking_tables ct
JOIN sys.tables  t ON ct.object_id = t.object_id
JOIN sys.schemas s ON t.schema_id = s.schema_id
WHERE s.name = '&amp;lt;schema&amp;gt;' AND t.name = '&amp;lt;table&amp;gt;';&lt;/LI-CODE&gt;
&lt;P&gt;Do&amp;nbsp;any of these fail with a permission error?&lt;/P&gt;
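&lt;P&gt;If any of them do, here is a minimal sketch of the grants involved (the principal name is a placeholder; substitute whatever database user your gateway connection maps to, and note that VIEW DATABASE PERFORMANCE STATE may not be available on older on-premises SQL Server versions):&lt;/P&gt;
&lt;LI-CODE lang="sql"&gt;-- Placeholder principal; substitute the gateway's database user
GRANT SELECT, EXECUTE ON SCHEMA::dbo TO [lakeflow_gateway_user];
GRANT VIEW DATABASE PERFORMANCE STATE TO [lakeflow_gateway_user];&lt;/LI-CODE&gt;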
&lt;P class="p1"&gt;&lt;FONT size="2" color="#FF6600"&gt;&lt;STRONG&gt;&lt;I&gt;If this answer resolves your question, could you mark it as “Accept as Solution”? That helps other users quickly find the correct fix.&lt;/I&gt;&lt;/STRONG&gt;&lt;/FONT&gt;&lt;I&gt;&lt;/I&gt;&lt;/P&gt;</description>
      <pubDate>Thu, 23 Apr 2026 12:18:44 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/does-lakeflow-connect-have-any-change-tracking-diagnostics/m-p/155333#M54228</guid>
      <dc:creator>Ashwin_DSA</dc:creator>
      <dc:date>2026-04-23T12:18:44Z</dc:date>
    </item>
  </channel>
</rss>

