05-15-2023 12:09 AM
My code:
-- Step 1: Create a preprocessed view of the source table
CREATE OR REPLACE TEMPORARY VIEW preprocessed_source AS
SELECT
Key_ID,
Distributor_ID,
Customer_ID,
Customer_Name,
Channel
FROM integr_masterdata.Customer_Master;
-- Step 2: Perform the merge operation using the preprocessed source table
MERGE INTO slvr_masterdata.Customer_Master as Target
USING preprocessed_source AS Source
ON Source.Key_ID = Target.Key_ID
WHEN MATCHED THEN
UPDATE SET
Target.Distributor_ID = Source.Distributor_ID,
Target.Customer_ID = Source.Customer_ID,
Target.Customer_Name = Source.Customer_Name,
Target.Channel = Source.Channel,
Target.Time_Stamp = current_timestamp()
WHEN NOT MATCHED THEN INSERT
(
Distributor_ID,
Customer_ID,
Customer_Name,
Channel,
Time_Stamp
)
VALUES (
Source.Distributor_ID,
Source.Customer_ID,
Source.Customer_Name,
Source.Channel,
current_timestamp()
);
05-15-2023 12:27 AM
You have duplicates in your incoming data according to the join condition (Key_ID in this case).
The way to handle this is to get rid of the dups before you do the merge.
05-15-2023 02:58 AM
Thank you werners for your answer. Do I have to remove the duplicates in the same code? Can you provide me the code?
05-15-2023 04:39 AM
First you have to find out what the cause of the duplicates is.
It might be that you are joining on an incomplete key; in that case you have to change your join condition.
Or perhaps you can just do a dropDuplicates/DISTINCT, as in the sketch below.
Btw, I never use SQL to prepare data; imo you lose a lot of flexibility.
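A minimal sketch of that dedup step in Spark SQL, applied to the view from the original post. It assumes Key_ID is the intended unique key; since the source shown has no column marking which duplicate is newest, the ROW_NUMBER ordering below is arbitrary.
-- Option A: collapse rows that are exact duplicates across all selected columns
CREATE OR REPLACE TEMPORARY VIEW preprocessed_source AS
SELECT DISTINCT
Key_ID,
Distributor_ID,
Customer_ID,
Customer_Name,
Channel
FROM integr_masterdata.Customer_Master;
-- Option B: keep exactly one row per Key_ID, even when the other columns differ
CREATE OR REPLACE TEMPORARY VIEW preprocessed_source AS
SELECT Key_ID, Distributor_ID, Customer_ID, Customer_Name, Channel
FROM (
SELECT *,
ROW_NUMBER() OVER (PARTITION BY Key_ID ORDER BY Key_ID) AS rn -- no tiebreaker column available, so the kept row is arbitrary
FROM integr_masterdata.Customer_Master
) AS dedup
WHERE rn = 1;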
01-10-2024 06:56 AM
Hey, as previously stated, you could drop the duplicates from the columns that contain them (code for this is easy to find online). I have had this problem myself, and it appeared when creating a temporary view from a DataFrame: the DataFrame didn't include duplicates, but the temp view did (see the check below). Hope this helps.
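A quick way to confirm that, using the temp view name from the original post; it lists every merge-key value that occurs more than once:
-- List Key_ID values that appear in more than one source row
SELECT Key_ID, COUNT(*) AS row_count
FROM preprocessed_source
GROUP BY Key_ID
HAVING COUNT(*) > 1;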
05-13-2025 08:54 AM - edited 05-13-2025 08:56 AM
This error occurs when we try to update all the cells of target_data without a single changed record in source_data (updates_data). To resolve it, add an update_time column with a unix timestamp, or change at least one cell of the streaming/batch/incremental data, so that the Delta table knows it is not a duplicate.
In your scenario, re-running the notebook with current timestamp only changes the value at the hour/day level, not at the seconds/minutes level, so the whole dataset looks like a duplicate because you re-ran within an hour (less than 60 minutes).
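For reference, a sketch of that suggestion against the view from the original post; the update_time column name comes from the post above, and whether this resolves the error will depend on the pipeline:
-- Hypothetical variant of the source view that carries the suggested update_time column
CREATE OR REPLACE TEMPORARY VIEW preprocessed_source AS
SELECT
Key_ID,
Distributor_ID,
Customer_ID,
Customer_Name,
Channel,
unix_timestamp() AS update_time -- whole seconds since the epoch, so it changes on every run
FROM integr_masterdata.Customer_Master;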