<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Need to track the schema changes/column renames/column drops in Databricks Unity Catalog in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/need-to-track-the-schema-changes-column-renames-column-drops-in/m-p/122180#M46681</link>
    <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/170539"&gt;@data_learner1&lt;/a&gt;&amp;nbsp;,&lt;/P&gt;&lt;P&gt;Audit logs focus on security and usage monitoring, such as user access and table read/write events; they don't track schema-level changes.&lt;/P&gt;&lt;P&gt;To track schema-level changes, the Delta transaction log is the best option.&amp;nbsp;The transaction log files are typically stored in a _delta_log directory inside the table's root directory. Here is a sample snippet you can use to query the log files for a table.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;LI-CODE lang="python"&gt;# each line in a commit file is one action; metaData rows carry schema changes
df = spark.read.json("/path/to/your/delta/table/_delta_log/*.json")
df.select("commitInfo", "metaData").display()&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Hope this helps!&lt;/P&gt;</description>
    <pubDate>Wed, 18 Jun 2025 18:14:32 GMT</pubDate>
    <dc:creator>KaranamS</dc:creator>
    <dc:date>2025-06-18T18:14:32Z</dc:date>
    <item>
      <title>Need to track the schema changes/column renames/column drops in Databricks Unity Catalog</title>
      <link>https://community.databricks.com/t5/data-engineering/need-to-track-the-schema-changes-column-renames-column-drops-in/m-p/122151#M46676</link>
      <description>&lt;P&gt;Hi Team,&amp;nbsp;&lt;/P&gt;&lt;P&gt;We receive data from a third-party vendor into the Databricks Unity Catalog. They make schema changes frequently, and we would like to track those changes. I wanted to know whether I can do this using the audit table in the system catalog. As we only have read permissions, we may not have access ourselves, but we could ask them to create a view for us. I also wanted to understand whether the audit table logs can help us in this scenario. Please let us know if any of you has a better solution, or how you are doing this, and share some insights.&lt;/P&gt;&lt;P&gt;Thank you&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Wed, 18 Jun 2025 16:48:01 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/need-to-track-the-schema-changes-column-renames-column-drops-in/m-p/122151#M46676</guid>
      <dc:creator>data_learner1</dc:creator>
      <dc:date>2025-06-18T16:48:01Z</dc:date>
    </item>
    <item>
      <title>Re: Need to track the schema changes/column renames/column drops in Databricks Unity Catalog</title>
      <link>https://community.databricks.com/t5/data-engineering/need-to-track-the-schema-changes-column-renames-column-drops-in/m-p/122180#M46681</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/170539"&gt;@data_learner1&lt;/a&gt;&amp;nbsp;,&lt;/P&gt;&lt;P&gt;Audit logs focus on security and usage monitoring, such as user access and table read/write events; they don't track schema-level changes.&lt;/P&gt;&lt;P&gt;To track schema-level changes, the Delta transaction log is the best option.&amp;nbsp;The transaction log files are typically stored in a _delta_log directory inside the table's root directory. Here is a sample snippet you can use to query the log files for a table.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;LI-CODE lang="python"&gt;# each line in a commit file is one action; metaData rows carry schema changes
df = spark.read.json("/path/to/your/delta/table/_delta_log/*.json")
df.select("commitInfo", "metaData").display()&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Hope this helps!&lt;/P&gt;</description>
      <pubDate>Wed, 18 Jun 2025 18:14:32 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/need-to-track-the-schema-changes-column-renames-column-drops-in/m-p/122180#M46681</guid>
      <dc:creator>KaranamS</dc:creator>
      <dc:date>2025-06-18T18:14:32Z</dc:date>
    </item>
    <item>
      <title>Re: Need to track the schema changes/column renames/column drops in Databricks Unity Catalog</title>
      <link>https://community.databricks.com/t5/data-engineering/need-to-track-the-schema-changes-column-renames-column-drops-in/m-p/122336#M46746</link>
      <description>&lt;P&gt;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/67022"&gt;@KaranamS&lt;/a&gt;&amp;nbsp;: Suppose I have a list of tables in the Databricks Unity Catalog. With only read access, how can I find out whether they are renaming any of the existing tables?&lt;/P&gt;&lt;P&gt;Thank you for your suggestions.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Fri, 20 Jun 2025 11:19:29 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/need-to-track-the-schema-changes-column-renames-column-drops-in/m-p/122336#M46746</guid>
      <dc:creator>data_learner1</dc:creator>
      <dc:date>2025-06-20T11:19:29Z</dc:date>
    </item>
    <item>
      <title>Re: Need to track the schema changes/column renames/column drops in Databricks Unity Catalog</title>
      <link>https://community.databricks.com/t5/data-engineering/need-to-track-the-schema-changes-column-renames-column-drops-in/m-p/122870#M46891</link>
      <description>&lt;P&gt;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/170539"&gt;@data_learner1&lt;/a&gt;&amp;nbsp;, another way to track table renames would be snapshots: run 'SHOW TABLES' on the catalog, save the output on a daily basis, and compare the current snapshot with the previous one to find dropped or renamed tables.&lt;/P&gt;</description>
      <pubDate>Wed, 25 Jun 2025 17:47:46 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/need-to-track-the-schema-changes-column-renames-column-drops-in/m-p/122870#M46891</guid>
      <dc:creator>KaranamS</dc:creator>
      <dc:date>2025-06-25T17:47:46Z</dc:date>
    </item>
    <item>
      <title>Re: Need to track the schema changes/column renames/column drops in Databricks Unity Catalog</title>
      <link>https://community.databricks.com/t5/data-engineering/need-to-track-the-schema-changes-column-renames-column-drops-in/m-p/122872#M46893</link>
      <description>&lt;P&gt;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/170539"&gt;@data_learner1&lt;/a&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Unity Catalog logs &lt;STRONG&gt;all data access and metadata operations&lt;/STRONG&gt; (including schema changes) into the &lt;STRONG&gt;audit logs&lt;/STRONG&gt;, which are stored in the &lt;STRONG&gt;system catalog tables&lt;/STRONG&gt;, such as:&lt;/P&gt;&lt;PRE&gt;system.access.audit&lt;/PRE&gt;&lt;P&gt;&lt;SPAN&gt;You mentioned you &lt;/SPAN&gt;&lt;STRONG&gt;only have read access&lt;/STRONG&gt;&lt;SPAN&gt;, so you likely have &lt;/SPAN&gt;&lt;STRONG&gt;no access to system.access.audit&lt;/STRONG&gt;&lt;SPAN&gt;, which is only visible to &lt;/SPAN&gt;&lt;STRONG&gt;metastore admins&lt;/STRONG&gt;&lt;SPAN&gt; or users with elevated privileges.&lt;/SPAN&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Wed, 25 Jun 2025 18:02:34 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/need-to-track-the-schema-changes-column-renames-column-drops-in/m-p/122872#M46893</guid>
      <dc:creator>CURIOUS_DE</dc:creator>
      <dc:date>2025-06-25T18:02:34Z</dc:date>
    </item>
  </channel>
</rss>

