Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

'ignoreDeletes' option with Delta Live Table streaming source

Zachary_Higgins
Contributor

We have a Delta streaming source in our Delta Live Table pipelines that may have data deleted from time to time.

The error message is pretty self-explanatory:

...from streaming source at version 191. This is currently not supported. If you'd like to ignore deletes, set the option 'ignoreDeletes' to 'true'.

What's not clear is how to set this option. This is what we have now, but it's not producing the desired result, which is that new data is read and deletes are ignored:

SET pipelines.ignoreDeletes = true;
CREATE OR REFRESH STREAMING LIVE TABLE...

How should this option be set in a delta live table?

8 REPLIES

Hi - Thanks for the response. Does your suggestion work with Delta live tables when you try it? This seems to produce the same error message when I use the code below:

@dlt.table(
    ...
)
def table_fnc():
    return spark.readStream.format("delta").option("ignoreDeletes", "true").table("tablename")

I'm not worried about duplicates. I just want to stream out the table's current state and append it to a sink in my DLT pipeline. As far as I know, DLT can't just append data from a source unless it's streamed in...

I haven't heard back, but the response above was copied and pasted from here: Table streaming reads and writes | Databricks on AWS

We decided to just move these tables to a true structured stream. We hope that DLT can support simple appends later on.
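For reference, moving a table out of DLT and into a standalone Structured Streaming job might look like the sketch below. This is an assumption-laden illustration, not the poster's actual code: the source/sink table names and checkpoint path are placeholders, and it assumes an existing Databricks/Spark session with Delta support.

```python
# Sketch of a standalone Structured Streaming append job that ignores
# delete-only commits on the Delta source. All names are placeholders.
df = (
    spark.readStream
    .format("delta")
    .option("ignoreDeletes", "true")  # skip commits that only delete data
    .table("source_table")
)

(
    df.writeStream
    .format("delta")
    .option("checkpointLocation", "/checkpoints/source_table_append")
    .outputMode("append")
    .toTable("sink_table")
)
```

Because the read is a plain `spark.readStream`, Delta reader options like `ignoreDeletes` can be set directly, which is not possible through `dlt.read_stream`.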

@Kaniz Fatma​ - Has Databricks found a way to prune unwanted records from a source without requiring the entire sink table be recalculated with DLT?

JohnA
New Contributor III

@Kaniz Fatma Hi Kaniz, can we please circle back to this? Like @Zachary Higgins, I am unsure how to set the ignoreDeletes or ignoreChanges spark.sql configuration for my Delta Live Table pipeline defined in SQL.

Thanks

7effrey
New Contributor III

Databricks, please provide an answer to this. There seems to be no documentation on how Delta Live Tables supports table updates. The ignoreChanges option is bound to the spark.readStream method, which is not available on dlt.read_stream.

Michael42
New Contributor III

I am looking at this as well and would like to understand my options here.

yegorski
New Contributor III

We had to delete some records in the destination tables created by the DLT pipeline today and hit this error. What resolved it was using spark.readStream, where those options can be set. They can't be set on `dlt.read_stream`.

 

# df = dlt.read_stream("request_params")  # previous approach that hit the error
df = (
    spark.readStream
    .option("skipChangeCommits", "true")
    .option("ignoreDeletes", "true")
    .table("operations.api_logs.request_params")
)

 
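Wrapped in a full DLT table definition, the workaround above might look like the following sketch. The table name and function name are assumptions added for illustration; it keeps the same source table and reader options from the snippet above and assumes it runs inside a DLT pipeline where the `dlt` module and `spark` session are provided.

```python
import dlt

# Hypothetical DLT table that reads the Delta source via spark.readStream
# so that Delta reader options can be set, which dlt.read_stream does not allow.
@dlt.table(name="request_params_clean")
def request_params_clean():
    return (
        spark.readStream
        .option("skipChangeCommits", "true")  # ignore commits that update/delete rows
        .option("ignoreDeletes", "true")      # ignore delete-only commits
        .table("operations.api_logs.request_params")
    )
```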
