- 3228 Views
- 2 replies
- 0 kudos
My Code:

CREATE OR REPLACE TEMPORARY VIEW preprocessed_source AS
SELECT Key_ID, Distributor_ID, Customer_ID, Customer_Name, Channel
FROM integr_masterdata.Customer_Master;

-- Step 2: Perform the merge operation using the preprocessed source table...
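A minimal sketch of what the elided merge step (Step 2) might look like, assuming a Databricks notebook session and a hypothetical target table Customer_Master_Target keyed on Key_ID (both names are illustrative, not from the post):

Python:
# Sketch only: the target table name and join key are assumed, not from the post.
spark.sql("""
    MERGE INTO integr_masterdata.Customer_Master_Target AS t
    USING preprocessed_source AS s
    ON t.Key_ID = s.Key_ID
    WHEN MATCHED THEN UPDATE SET *
    WHEN NOT MATCHED THEN INSERT *
""")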
Latest Reply
Hi @Prashant Joshi, hope everything is going great. Just wanted to check in to see if you were able to resolve your issue. If yes, would you be happy to mark an answer as best so that other members can find the solution more quickly? If not, please tell us s...
1 More Replies
by Mr__D • New Contributor II
- 16563 Views
- 1 replies
- 0 kudos
Hello all, could anyone please suggest the best way to populate (upsert) data from a Delta table into a SQL Server table? We are transforming our data in Databricks and storing it in a Delta table, but for reporting purposes we need to pop...
Latest Reply
@Deepak Bhatt: Yes, using the Spark Synapse connector could be a good option for upserting data from a Delta table into a SQL Server table. The Spark Synapse connector allows you to read and write data from Azure Synapse Analytics, formerly known as...
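A plain JDBC write is another way to sketch this. Spark's JDBC writer can only append or overwrite, so a common upsert pattern is to land the Delta data in a SQL Server staging table and run a T-SQL MERGE on the server side. All names and connection values below are placeholders:

Python:
# Sketch only: server, database, credentials, and table names are placeholders.
df = spark.read.table("my_delta_table")
(df.write
    .format("jdbc")
    .option("url", "jdbc:sqlserver://<server>:1433;databaseName=<db>")
    .option("dbtable", "dbo.staging_table")
    .option("user", "<user>")
    .option("password", "<password>")
    .mode("overwrite")
    .save())
# A T-SQL MERGE from dbo.staging_table into the reporting table would then
# complete the upsert on the SQL Server side.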
- 2590 Views
- 3 replies
- 1 kudos
Hi guys, I have a question about upsert/merge... What do you do when the source row does NOT exist, but you need to change the status in the target? For example: 01/03: source dataset [id = 1 and status = Active]; target table [*not exists*] >> at this time the ...
Latest Reply
Hello @William Scardua, just adding to what @Vigneshraja Palaniraj replied. Reference: https://docs.databricks.com/sql/language-manual/delta-merge-into.html
Thanks & Regards, Nandini
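On newer Databricks Runtime versions (12.1 and above), the MERGE statement in that reference also supports a WHEN NOT MATCHED BY SOURCE clause, which covers exactly the case in the question (the target row exists but the source row does not). A sketch with assumed table and column names:

Python:
# Sketch only: table and column names are illustrative.
spark.sql("""
    MERGE INTO target AS t
    USING source AS s
    ON t.id = s.id
    WHEN MATCHED THEN UPDATE SET t.status = s.status
    WHEN NOT MATCHED THEN INSERT (id, status) VALUES (s.id, s.status)
    WHEN NOT MATCHED BY SOURCE THEN UPDATE SET t.status = 'Inactive'
""")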
2 More Replies
- 13654 Views
- 5 replies
- 0 kudos
This is the error that occurs while processing concurrent merges into Delta Lake tables in Azure Databricks: ConcurrentAppendException: Files were added to the root of the table by a concurrent update. Please try the operation again. What are the o...
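Two common mitigations are to narrow the merge condition with a partition predicate, so that concurrent merges touch disjoint sets of files, and to retry when a conflict still occurs. A sketch assuming the delta-spark Python bindings (which expose ConcurrentAppendException in delta.exceptions) and illustrative table, column, and partition names:

Python:
# Sketch only: table names, columns, and the partition predicate are illustrative.
import time
from delta.exceptions import ConcurrentAppendException

def merge_one_partition(date_value, max_retries=3):
    for attempt in range(max_retries):
        try:
            spark.sql(f"""
                MERGE INTO target AS t
                USING updates AS s
                ON t.id = s.id AND t.event_date = '{date_value}'
                WHEN MATCHED THEN UPDATE SET *
                WHEN NOT MATCHED THEN INSERT *
            """)
            return
        except ConcurrentAppendException:
            time.sleep(2 ** attempt)  # back off, then retry the merge
    raise RuntimeError(f"Merge for {date_value} failed after {max_retries} retries")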
Latest Reply
Hi @Abhishek Dutta, hope all is well! Just wanted to check in to see if you were able to resolve your issue; if so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Tha...
4 More Replies
by Netty • New Contributor III
- 5432 Views
- 5 replies
- 7 kudos
I had been trying to upsert rows into a table in Azure Blob Storage (ADLS Gen 2) based on two partitions (sample code below).

insert overwrite table new_clicks_table partition(client_id, mm_date)
select
    click_id
    ,user_id
    ,click_timestamp_gmt
    ,ca...
Latest Reply
The code below might help you.

Python:
(df.write
    .mode("overwrite")
    .option("partitionOverwriteMode", "dynamic")
    .saveAsTable("default.people10m")
)

SQL:
SET spark.sql.sources.partitionOverwriteMode=dynamic;
INSERT OVERWRITE TABLE default.people10m...
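For completeness, a sketch of the same dynamic-overwrite pattern applied to the original new_clicks_table example; the staging source name is hypothetical:

Python:
# Sketch only: clicks_staging is a hypothetical source table. With dynamic mode,
# INSERT OVERWRITE replaces only the partitions present in the incoming data.
spark.conf.set("spark.sql.sources.partitionOverwriteMode", "dynamic")
spark.sql("""
    INSERT OVERWRITE TABLE new_clicks_table PARTITION (client_id, mm_date)
    SELECT click_id, user_id, click_timestamp_gmt, client_id, mm_date
    FROM clicks_staging
""")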
4 More Replies
- 19277 Views
- 4 replies
- 5 kudos
I have a table `demo_table_one` in which I want to upsert the following values:

data = [
    (11111, 'CA', '2020-01-26'),
    (11111, 'CA', '2020-02-26'),
    (88888, 'CA', '2020-06-10'),
    (88888, 'CA', '2020-05-10'),
    (88888, 'WA', '2020-07-10'),
    ...
Latest Reply
@John Constantine, can you additionally share what data is in demo_table_one? We have only df (aliased as update_table) in that example.
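In the meantime, a sketch of how such an upsert is often written with the Delta Lake Python API, assuming column names (id, state, date) inferred from the sample tuples and id + state as the merge key (both are assumptions, not from the thread):

Python:
# Sketch only: column names and the merge key are assumed from the sample data.
from pyspark.sql import functions as F
from delta.tables import DeltaTable

data = [
    (11111, 'CA', '2020-01-26'),
    (11111, 'CA', '2020-02-26'),
    (88888, 'CA', '2020-06-10'),
]

# Delta merge requires unique source keys; the sample repeats (id, state),
# so keep only the latest date per key before merging.
update_df = (spark.createDataFrame(data, ["id", "state", "date"])
    .groupBy("id", "state")
    .agg(F.max("date").alias("date"))
    .alias("update_table"))

(DeltaTable.forName(spark, "demo_table_one").alias("target")
    .merge(update_df,
           "target.id = update_table.id AND target.state = update_table.state")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute())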
3 More Replies