Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

'ignoreDeletes' option with Delta Live Table streaming source

Zachary_Higgins
Contributor

We have a Delta streaming source in our Delta Live Table pipelines that may have data deleted from time to time.

The error message is pretty self-explanatory:

...from streaming source at version 191. This is currently not supported. If you'd like to ignore deletes, set the option 'ignoreDeletes' to 'true'.

What's not clear is how to set this option. This is what we have now, but it's not producing the desired result, which is that new data is read and deletes are ignored.

SET pipelines.ignoreDeletes = true;
CREATE OR REFRESH STREAMING LIVE TABLE...

How should this option be set in a Delta Live Table?
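For reference, ignoreDeletes is documented as an option on the Delta streaming reader itself, so in a Python DLT notebook it would normally be attached to the reader rather than set with a SQL SET statement. The sketch below is illustrative only: the table names are placeholders, and the replies further down report mixed results with this form inside DLT.

import dlt

# Minimal sketch (not from the original post): pass ignoreDeletes to the Delta
# streaming reader inside a DLT table definition. Table names are placeholders.
@dlt.table(name="my_streaming_table")
def my_streaming_table():
    return (
        spark.readStream
        .format("delta")
        .option("ignoreDeletes", "true")  # skip commits that only delete data
        .table("source_table")
    )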

6 REPLIES

Hi - Thanks for the response. Does your suggestion work with Delta Live Tables when you try it? It seems to produce the same error message when I use the code below:

import dlt

@dlt.table(
    ...  # table name and properties elided in the original post
)
def table_fnc():
    return spark.readStream.format("delta").option("ignoreDeletes", "true").table("tablename")

I'm not worried about duplicates. I just want to stream out the table's current state and append it to a sink in my DLT pipeline. As far as I know, DLT can't just append data from a source unless it's streamed in...

I haven't heard back, but the response above was copied and pasted from here: Table streaming reads and writes | Databricks on AWS

We decided to just move these tables to a regular Structured Streaming job. We hope that DLT can support simple appends later on.
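For what it's worth, a standalone Structured Streaming job of the kind described here might look like the sketch below; the table names and checkpoint path are placeholders, not taken from the thread.

# Hypothetical standalone job outside DLT: read the Delta source while ignoring
# delete-only commits, then append the stream to a sink table.
stream_df = (
    spark.readStream
    .format("delta")
    .option("ignoreDeletes", "true")
    .table("source_table")
)

(
    stream_df.writeStream
    .format("delta")
    .outputMode("append")
    .option("checkpointLocation", "/tmp/checkpoints/source_table_sink")
    .toTable("sink_table")
)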

@Kaniz Fatma - Has Databricks found a way to prune unwanted records from a source without requiring the entire sink table to be recalculated with DLT?

JohnA
New Contributor III

@Kaniz Fatma Hi Kaniz, can we please circle back to this? Like @Zachary Higgins, I am unsure how to set the ignoreDeletes or ignoreChanges Spark configuration for my Delta Live Table pipeline defined in SQL.

Thanks

7effrey
New Contributor III

Databricks, please provide an answer to this. There seems to be no documentation on how Delta Live Tables supports table updates. The ignoreChanges option belongs to the spark.readStream reader and cannot be passed to dlt.read_stream.
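To illustrate the distinction being made here: dlt.read_stream() takes only the name of another live table, whereas the reader option is accepted by spark.readStream. The snippet below is a sketch with placeholder names, not a confirmed fix.

import dlt

# As noted above, dlt.read_stream() reads another table in the same pipeline
# and exposes no hook for Delta reader options such as ignoreChanges:
df_live = dlt.read_stream("some_live_table")

# The option is accepted by the underlying Spark streaming reader instead
# (placeholder source table name):
df_delta = (
    spark.readStream
    .option("ignoreChanges", "true")  # or "ignoreDeletes"
    .table("source_schema.source_table")
)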

Michael42
New Contributor III

I am looking at this as well and would like to understand my options here.
