06-02-2023 06:52 AM
Hi All,
I have a scenario where my existing Delta table looks like below:
Now I have incremental data with an additional column, i.e. owner:
DataFrame name --> scdDF. Below is the code snippet to merge the incremental DataFrame into targetTable, but the new column is not getting added:
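(The snippet below is a minimal sketch of such a merge using the Delta Python API; the merge key id is an assumption.)

from delta.tables import DeltaTable

targetTable = DeltaTable.forName(spark, "targetTable")

(targetTable.alias("t")
    .merge(scdDF.alias("s"), "t.id = s.id")  # merge key is an assumption
    .whenMatchedUpdateAll()                  # UPDATE SET *
    .whenNotMatchedInsertAll()               # INSERT *
    .execute())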
spark.conf.set("spark.databricks.delta.schema.autoMerge.enabled",True) --> this is also enabled.
But still no luck. Below is the final result which I'm currently getting:
The data looks correct, but the only issue is that the new column, i.e. owner, is still not merged into targetTable.
Someone please help.
Thanks!
Accepted Solutions
06-03-2023 05:49 PM
Just add that column to the table with an ALTER TABLE statement. Capture each table's/DataFrame's columns as a list with df.columns, compare them, and if the table is missing anything, run the ALTER TABLE. Then run your merge code.
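A minimal sketch of that approach (targetTable and scdDF are the names from the post; everything else is an assumption):

# Columns already present on the target Delta table
target_cols = set(spark.table("targetTable").columns)

# Add any column the target is missing, using the source DataFrame's type
for field in scdDF.schema.fields:
    if field.name not in target_cols:
        spark.sql(
            f"ALTER TABLE targetTable "
            f"ADD COLUMNS ({field.name} {field.dataType.simpleString()})"
        )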
06-06-2023 12:02 AM
Hi @Divyansh Jain,
Thank you for posting your question in our community! We are happy to assist you.
To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question?
This will also help other community members who may have similar questions in the future. Thank you for your participation and let us know if you need any further assistance!
06-16-2023 02:29 AM
This is resolved. Thanks, everyone!
06-16-2023 02:34 AM
@Vidula Khanna Enabling the below property resolved my issue:
spark.conf.set("spark.databricks.delta.schema.autoMerge.enabled",True)
Thanks very much!
10-25-2023 03:34 AM
Sorry, can you please explain what the problem was? Because you already had that property enabled.
09-26-2024 08:51 PM
In Databricks Runtime 15.2 and above, you can specify schema evolution in a merge statement using SQL or Delta table APIs:
MERGE WITH SCHEMA EVOLUTION INTO target
USING source
ON source.key = target.key
WHEN MATCHED THEN
UPDATE SET *
WHEN NOT MATCHED THEN
INSERT *
WHEN NOT MATCHED BY SOURCE THEN
DELETE
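And a sketch of the equivalent with the Delta table API (Delta Lake 3.2+; withSchemaEvolution() opts a single merge into schema evolution; table and key names are taken from the SQL above):

from delta.tables import DeltaTable

target = DeltaTable.forName(spark, "target")

(target.alias("t")
    .merge(source.alias("s"), "s.key = t.key")
    .withSchemaEvolution()               # per-merge schema evolution
    .whenMatchedUpdateAll()              # UPDATE SET *
    .whenNotMatchedInsertAll()           # INSERT *
    .whenNotMatchedBySourceDelete()      # DELETE
    .execute())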

