Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

DLT schema evolution/changes in the logs

ankit001mittal
New Contributor III

Hi all,

I want to figure out, from the DLT logs, when schema evolution or schema changes happen in the objects of my DLT pipelines.
Could you please share some sample DLT logs that show schema changes?

Thank you for your help.

1 REPLY

mark_ott
Databricks Employee

To find when schema evolution or changes are happening in objects within DLT (Delta Live Tables) pipelines, monitor entries in the DLT logs or Delta transaction logs that signal modifications to a table's underlying schema: columns added or removed, field types changed, or nested structures altered. Here are sample patterns and examples of what to look for in DLT logs to identify schema changes:

Sample DLT Log Entries Showing Schema Changes

  • When columns are added, removed, or their types are changed, DLT logs or Delta logs can include:

    • Events typed as SCHEMA_CHANGE, ALTER_TABLE, or with operation parameters indicating schema modifications.

    • A metadata field or message describing an updated schema, often presented in JSON format.
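As a rough illustration, filtering raw JSON log lines for schema-related events can be sketched in plain Python. The event-type names below mirror the hypothetical entries in this thread, not a documented DLT event taxonomy:

```python
import json

# Event types treated as schema-related; these mirror the hypothetical
# examples in this thread, not an official DLT event schema.
SCHEMA_EVENT_TYPES = {"SCHEMA_CHANGE", "SCHEMA_EVOLUTION", "ALTER_TABLE"}

def schema_events(log_lines):
    """Parse JSON log lines and keep only schema-related events."""
    events = []
    for line in log_lines:
        try:
            entry = json.loads(line)
        except json.JSONDecodeError:
            continue  # skip lines that are not JSON
        if entry.get("eventType") in SCHEMA_EVENT_TYPES:
            events.append(entry)
    return events

lines = [
    '{"eventType": "SCHEMA_CHANGE", "operation": "ADD_COLUMN", "object": "Athletes"}',
    '{"eventType": "FLOW_PROGRESS", "object": "Athletes"}',
]
print([e["eventType"] for e in schema_events(lines)])  # ['SCHEMA_CHANGE']
```

The same filter works whether the lines come from a log file, a streaming sink, or an exported event table.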

Example 1: Detect Column Additions in DLT Logs

A schema change where a new column Sport is added:

text
{
  "timestamp": "2025-10-22T14:00:00Z",
  "eventType": "SCHEMA_CHANGE",
  "operation": "ADD_COLUMN",
  "object": "Athletes",
  "details": {
    "added": ["Sport"],
    "previousSchema": ["Name", "Country"],
    "newSchema": ["Name", "Country", "Sport"]
  }
}

This log entry records the operation, the affected object, what was added, and the before/after schema. Similar entries appear for column removals or type changes.
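Given an entry of that shape, the added and removed columns can be derived by diffing the two schema lists. A minimal sketch, assuming the hypothetical previousSchema/newSchema fields above:

```python
def schema_diff(previous, new):
    """Return columns added and removed between two flat schema lists."""
    prev_set, new_set = set(previous), set(new)
    return {
        "added": sorted(new_set - prev_set),
        "removed": sorted(prev_set - new_set),
    }

diff = schema_diff(["Name", "Country"], ["Name", "Country", "Sport"])
print(diff)  # {'added': ['Sport'], 'removed': []}
```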

Example 2: Delta Log JSON Sampling

DLT leverages underlying Delta transaction logs. Schema changes show up in these logs:

text
spark.read.json("dbfs:/tmp/delta/table/_delta_log/*.json").createOrReplaceTempView("delta_log")
display(spark.sql("select metadata.schemaString, input_file_name() from delta_log where metadata is not null"))

You can query for entries where metadata.schemaString changes across transactions; these show when any schema change occurred.
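Outside a cluster, the comparison across transactions can be sketched in plain Python. Assume you have already collected the metadata.schemaString value (a JSON string) for each commit, in commit order:

```python
import json

def schema_change_commits(schema_strings):
    """Return indices of commits whose schemaString differs from the previous one.

    schema_strings: list of JSON schema strings, one per commit, in order.
    """
    changes = []
    previous = None
    for i, raw in enumerate(schema_strings):
        schema = json.loads(raw)  # parse to ignore formatting differences
        if previous is not None and schema != previous:
            changes.append(i)
        previous = schema
    return changes

commits = [
    '{"fields": [{"name": "Name"}, {"name": "Country"}]}',
    '{"fields": [{"name": "Name"}, {"name": "Country"}]}',
    '{"fields": [{"name": "Name"}, {"name": "Country"}, {"name": "Sport"}]}',
]
print(schema_change_commits(commits))  # [2]
```

Each returned index marks a transaction where the table's schema diverged from the prior commit.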

Example 3: Nested Data Schema Evolution

When the structure changes (e.g., renaming nested objects or modifying nested data types), look for fields in the log such as:

text
{
  "eventType": "SCHEMA_EVOLUTION",
  "object": "Organization",
  "details": {
    "renamed": {"building": "main_block"},
    "removed": ["room"],
    "changedType": {"inventory_nr": "string"}
  }
}

DLT logs can produce these entries when pipeline definitions encounter new data structures or when contracts enable schema evolution.

Example 4: DLT PlusLogCollector or Notification

Some observability frameworks on top of DLT (like PlusLogCollector) can generate alerts:

text
[INFO] 2025-10-22 13:05:02 Pipeline detected schema change: column 'score' added to table 'Scores'

Or:

text
send_slack_message("Schema change detected: New column 'ceo' in 'org'")

These log lines are useful for trigger-based notification of schema changes.
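A trigger-based notification like the above can be sketched in plain Python; the send_message callable here is a placeholder for whatever alerting hook (Slack, email, webhook) your observability setup provides:

```python
def notify_on_schema_change(events, send_message):
    """Call send_message once per schema-related event.

    events: iterable of parsed log entries (dicts); the eventType/operation/
    object field names follow the hypothetical examples in this thread.
    """
    for event in events:
        if event.get("eventType") in {"SCHEMA_CHANGE", "SCHEMA_EVOLUTION"}:
            table = event.get("object", "<unknown>")
            op = event.get("operation", "modification")
            send_message(f"Schema change detected: {op} on table '{table}'")

sent = []
notify_on_schema_change(
    [{"eventType": "SCHEMA_CHANGE", "operation": "ADD_COLUMN", "object": "Scores"}],
    sent.append,
)
print(sent)  # ["Schema change detected: ADD_COLUMN on table 'Scores'"]
```

Swapping sent.append for a real messaging client turns this into the kind of alert shown in Example 4.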

How to Extract Schema Change Information

  • Query the Delta transaction logs using Spark as shown above and monitor differences in metadata.schemaString.

  • Monitor for specific DLT pipeline log events tagged with schema-related modifications.

  • Use observability tools to catch and alert on schema change logs during pipeline runs.

These patterns and snippets illustrate typical DLT log entries that highlight schema evolution, and show how such changes are recorded and can be monitored for reporting or alerting in ETL and DLT workflows.