Concurrent issue on delta lake insert update
12-15-2024 07:10 AM
Hey team! I need your help with Delta Lake; let me explain my scenario.
Scenario: I have a Delta Lake table, and two Databricks workflows running in parallel that perform insert and update tasks on it.
My Delta table is partitioned by country code.
My code for the insert is:
df.write.mode("append").saveAsTable(table_name)
For the update I use merge:
deltaTable.alias("t").merge(
    df.alias("s"),
    "s.user_id = t.user_id"
).whenMatchedUpdate(set={"col": "s.col", ...}).execute()
I'm getting a concurrency error when the jobs run in parallel.
#databricks #deltalake
12-15-2024 09:24 AM
Hi, @Dhanushn
In response to your question: you may have already seen it, but here's a link to the relevant section in the official documentation.
https://docs.databricks.com/en/optimizations/isolation-level.html#concurrentappendexception
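As that section explains, ConcurrentAppendException is raised when a concurrent operation may have added files to a part of the table your operation reads, and the usual mitigation is to make each job's condition explicitly disjoint. Since your table is partitioned by country code, each workflow can pin its own partition inside the merge condition. A minimal sketch, assuming the partition column is named `country_code` (the helper below is hypothetical):

```python
# Hypothetical helper: build a merge condition that pins the target
# partition, so concurrent merges on different countries touch disjoint
# partitions and Delta's conflict detection can prove they don't overlap.
def partition_pruned_condition(country_code: str) -> str:
    return f"s.user_id = t.user_id AND t.country_code = '{country_code}'"

# Each parallel workflow passes only the country it owns, e.g.:
#   cond = partition_pruned_condition("US")
#   deltaTable.alias("t").merge(df.alias("s"), cond) \
#       .whenMatchedUpdate(set={"col": "s.col"}).execute()
```

The same idea applies to the append job: if each workflow only ever writes its own country's partition, the merge's partition predicate tells Delta the two operations cannot conflict.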
I hope this information is helpful to you!
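One more note: if occasional conflicts remain (for example, when both workflows do touch the same partition), the documented fallback is to retry the failed operation. A hedged sketch of a retry wrapper; `run_merge` stands for whatever function executes your merge, and in practice you would catch `delta.exceptions.ConcurrentAppendException` rather than matching on the class name:

```python
import time

def merge_with_retry(run_merge, max_attempts=3, base_delay_s=5):
    # Retry only on concurrent-modification errors; re-raise anything else,
    # and give up after max_attempts.
    for attempt in range(1, max_attempts + 1):
        try:
            return run_merge()
        except Exception as exc:
            # Matching on the class name keeps this sketch dependency-free;
            # narrow it to ConcurrentAppendException in real code.
            if "Concurrent" not in type(exc).__name__ or attempt == max_attempts:
                raise
            time.sleep(base_delay_s * attempt)  # simple linear backoff
```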
Takuya Omi (尾美拓哉)

