Hey team! I need your help with Delta Lake; let me explain my scenario.
Scenario: I have a Delta Lake table, and two Databricks workflows run in parallel against it, one doing inserts and the other doing updates.
The table is partitioned by country code.
My insert code is:

df.write.mode("append").saveAsTable(table_name)
For updates I use MERGE (PySpark):

deltaTable.alias("t").merge(
    df.alias("s"),
    "s.user_id = t.user_id"
).whenMatchedUpdate(set={"col": "s.col", ...}).execute()
I'm getting a concurrency error (I believe a ConcurrentAppendException) whenever the two jobs run at the same time. Any idea how to avoid it?
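My understanding from the Delta docs is that the conflict check is optimistic: a commit fails if a concurrent transaction appended files into a partition my transaction read, and a MERGE whose condition has no partition predicate effectively reads every partition. Here's a toy model of how I think that check behaves (pure Python, the function and sets are mine, not Delta's API):

```python
def conflicts(read_partitions, appended_partitions):
    """Toy model of Delta's ConcurrentAppendException check:
    a commit is rejected if another transaction appended files
    into any partition this transaction read."""
    return bool(set(read_partitions) & set(appended_partitions))

# A MERGE without a partition predicate reads all partitions,
# so any concurrent append conflicts:
print(conflicts({"US", "IN", "FR"}, {"IN"}))  # True

# A MERGE scoped to one country code only reads that partition,
# so a concurrent append elsewhere would not conflict:
print(conflicts({"US"}, {"IN"}))              # False
```

If that model is right, I'd guess adding the partition column to the merge condition (e.g. "s.user_id = t.user_id AND t.country_code = 'US'") would narrow what the MERGE reads, but please correct me if I've misunderstood.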
#databricks #deltalake