07-15-2024 01:43 AM
Hello!
I created a DLT pipeline whose sources are external tables, and I apply changes with stored_as_scd_type = 1. However, when I run the pipeline, I don't see any incremental updates; the data remains in the same state as when I first created the pipeline. Do you have any suggestions?
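For context, a minimal sketch of this kind of setup, assuming the external table is exposed to the pipeline as a streaming view; the table and view names below are placeholders, not the actual pipeline code:

import dlt

# Hypothetical source view over the external table; "catalog.schema.orders"
# is a placeholder name, and "spark" is provided by the DLT runtime.
@dlt.view(name="orders_source")
def orders_source():
    # readStream keeps the source incremental; a plain spark.read.table
    # would hand apply_changes the same full snapshot on every run.
    return spark.readStream.table("catalog.schema.orders")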
Accepted Solutions
07-16-2024 12:40 AM
Hello @lucasrocha,
Yes, we are receiving new records in our tables, and updated_at is always populated. We are using the same script to read data from a foreign catalog, and that DLT pipeline works completely fine.
import dlt
from pyspark.sql.functions import col

# Upsert changes from the source into the target, keeping only the latest
# row per id (SCD type 1), ordered by updated_at.
dlt.apply_changes(
    target = target_table,
    source = source_table,
    keys = ["id"],
    sequence_by = col("updated_at"),
    stored_as_scd_type = "1"
)
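As a side note, apply_changes writes into a streaming table that has to be declared before the call; a minimal sketch, assuming the same target name as above (your pipeline may already include this):

import dlt

# Declare the target streaming table that apply_changes will populate;
# "target_table" mirrors the variable used in the snippet above.
dlt.create_streaming_table(name="target_table")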
07-15-2024 01:13 PM
Hello @ksenija,
I hope this message finds you well.
Is your source table receiving new records? If so, are the fields (operation/sequenceNum) being filled?
If possible, please provide a sample of the code you are using to create your target table with apply_changes.
Best regards,
Lucas Rocha