11-20-2023 05:39 AM - edited 11-20-2023 06:00 AM
Hi Team,
I would like to know how we can continue streaming a change data feed from a Delta table when its schema is changed (non-additive schema changes such as dropping or renaming a column, or a schema migration).
I came across schemaTrackingLocation in readStream, but I am not sure how to use it to continue streaming the change feed. I am using Delta Lake v3.0 and Databricks Runtime 14.0.
Thank you in advance.
11-23-2023 12:44 AM
Hi @raghav99,
When dealing with schema changes in Delta Lake tables, especially in a streaming context, it's essential to understand the behaviour and implications.
Let's address your observations and concerns:
Schema tracking location:
schemaTrackingLocation is a powerful feature that allows you to enable streaming from Delta tables with column mapping enabled, with a schemaTrackingLocation specified.
Behavior with schema evolution:
Column mapping and schema changes:
Streaming behavior:
While schemaTrackingLocation helps track schema changes, it doesn't guarantee that the stream won't fail during schema updates.
Optimizing schema changes:
Final thoughts:
Remember that schema management is critical for maintaining data consistency and reliability.
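A minimal sketch of what this can look like in PySpark (table names and paths are placeholders; per the Delta Lake streaming docs, the schema tracking location is expected to be a directory under the stream's checkpoint location):

```python
# Hedged sketch, not verified outside a Databricks/Delta runtime:
# stream the change data feed from a Delta table while recording the
# source schema evolution in a schema tracking log.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

checkpoint = "/tmp/cdc_checkpoint"  # placeholder path

cdf_stream = (
    spark.readStream.format("delta")
    # consume the change data feed instead of the table snapshot
    .option("readChangeFeed", "true")
    # record source schema versions; kept under the checkpoint directory
    .option("schemaTrackingLocation", f"{checkpoint}/_schema_log")
    .table("my_catalog.my_schema.my_table")  # placeholder table
)

query = (
    cdf_stream.writeStream.format("delta")
    .option("checkpointLocation", checkpoint)
    .toTable("my_catalog.my_schema.my_target")  # placeholder target
)
```

Note that even with this in place, the stream is still expected to stop once at each non-additive schema change so the new schema can be acknowledged before restarting.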
11-22-2023 10:46 PM
Hi @raghav99, certainly! Streaming change data feeds from a Delta table with schema changes can be achieved using the schemaTrackingLocation option.
Let's dive into the details:
Delta table as a source:
Limiting input rate:
Handling schema changes:
Streaming from Delta tables with schema changes is powerful, and with the right configuration, you can handle updates, inserts, and deletes seamlessly.
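The input-rate point above can be sketched with the Delta source's maxFilesPerTrigger and maxBytesPerTrigger options (the values and the table name below are illustrative placeholders):

```python
# Hedged sketch: cap how much data each micro-batch pulls from the
# Delta change data feed source.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

stream = (
    spark.readStream.format("delta")
    .option("readChangeFeed", "true")
    # at most 100 new files considered per micro-batch
    .option("maxFilesPerTrigger", 100)
    # soft cap on bytes processed per micro-batch
    .option("maxBytesPerTrigger", "1g")
    .table("my_catalog.my_schema.my_table")  # placeholder table
)
```

Rate limiting keeps each micro-batch bounded, which also limits how much work is redone when the stream has to restart after a schema change.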
11-23-2023 12:36 AM - edited 11-23-2023 12:40 AM
Thank you for the response @Kaniz_Fatma.
Let's consider one or many schema changes per table: combinations of dropping a column, renaming a column, adding a new column, updating a column's datatype, etc.
I am using schemaTrackingLocation and have observed the stream breaking with a StreamingQueryException.
Each run updates the schema tracking log, but since there were multiple schema changes, I have to rerun the stream several times to keep advancing it, presumably because many schema updates happened on the table during these migration combinations.
I am not sure how many times I need to re-trigger the stream before my schemaTrackingLocation reaches the final, updated schema.
Is this expected behaviour or a bug?
From the docs, I was under the impression that with schemaTrackingLocation enabled, the stream doesn't fail and keeps picking up the right schema from the source table.
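One pass per schema change is the documented default; the restart-per-change loop can reportedly be avoided by explicitly allowing the non-additive changes via Spark SQL configuration. A hedged sketch, assuming the configuration keys described in the Delta Lake / Databricks streaming docs (verify the exact names for your runtime version):

```python
# Hedged sketch: allow the stream to proceed past column renames/drops
# without a manual restart for each one. Configuration key names are
# taken from the Delta streaming docs and should be checked against
# your Delta Lake / Databricks runtime version.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Broad setting: applies to all streams in this session.
spark.conf.set(
    "spark.databricks.delta.streaming.allowSourceColumnRenameAndDrop",
    "always",
)

# Narrower alternative: scope it to one stream by appending the
# checkpoint hash that the StreamingQueryException message reports.
# spark.conf.set(
#     "spark.databricks.delta.streaming."
#     "allowSourceColumnRenameAndDrop.<checkpoint_hash>",
#     "true",
# )
```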
11-29-2023 07:14 AM
I want to express my gratitude for your effort in selecting the most suitable solution. It's great to hear that your query has been successfully resolved. Thank you for your contribution.