UPDATE or DELETE with pipeline that needs to reprocess in DLT
10-09-2023 06:07 AM
I'm currently trying to replicate an existing pipeline that uses a standard RDBMS. I have no experience with Databricks at all.
I have about 4-5 tables (much like dimensions) with different event types, and I want my pipeline to produce a streaming table as the final output to make processing in the next pipeline easier.
My problem is that one of the tables defines the campaign of the event, and each of the other tables contains events related to that campaign. Unfortunately, because of the way the data is uploaded to the cloud, it isn't synchronized, so we can receive events whose campaign isn't defined yet.
The way I designed the pipeline is to segregate all the not-yet-defined data into a separate table; on each update, I join the segregated data against the campaign definitions again, union it with the non-segregated data, and then apply_changes into the target table. A rough sketch of the flow is below.
The problem: after the missing campaign definition arrives, the rejoined rows are treated as an update rather than an insert on the source, resulting in an error.
Is there any way to write this new data as new data rather than an update on the source? I don't want the pipeline to reprocess everything, since the data volume is quite considerable.
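To make the shape of the flow concrete, here is a minimal sketch of what I mean (every table, column, and key name below is a placeholder, not my real schema):

```python
import dlt
from pyspark.sql import functions as F

# Minimal sketch only -- all table, column, and key names are placeholders.

@dlt.view
def events_to_apply():
    # Events that arrived with their campaign already defined.
    matched = dlt.read_stream("matched_events")
    # Previously segregated events whose campaign definition has now arrived,
    # rejoined against the campaign table on each update.
    rematched = dlt.read_stream("rematched_events")
    return matched.unionByName(rematched)

# Final streaming table consumed by the next pipeline.
dlt.create_streaming_table("events_final")

dlt.apply_changes(
    target="events_final",
    source="events_to_apply",
    keys=["event_id"],              # placeholder key column
    sequence_by=F.col("event_ts"),  # placeholder ordering column
)
```

The failure happens when the previously segregated rows are processed again: by then they look like changed rows in the source rather than new appends.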
- Labels: Delta Lake, Spark, Workflows
11-02-2023 06:01 AM - edited 11-02-2023 06:01 AM
Hi @dowdark, just a friendly follow-up. Have you had a chance to go through my colleague's response to your inquiry? Was it helpful, or do you still need further assistance? Your response would be highly valued.
11-10-2023 12:54 AM
Hi @dowdark,
What is the error you get when the pipeline tries to update the rows instead of performing an insert? That should give us more information about the problem.
Please also raise an SF case with us, including this error and its complete stack trace.

