Handling Concurrent Writes to a Delta Table by delta-rs and Databricks Spark Job
07-30-2024 03:06 AM
Hi @dennyglee, @Retired_mod.
If I am writing data into the same Delta table from both delta-rs and a Databricks job, and some transactions are lost, how can I handle this?
Given that Databricks coordinates commits through its own commit service while delta-rs relies on a DynamoDB lock table for transaction-log coordination on S3, how can we safely handle concurrent writers from Databricks jobs and delta-rs writers on the same table?
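For context, this is roughly how our delta-rs side is set up: a sketch (not our exact code) of enabling the DynamoDB locking provider in the Python `deltalake` package so concurrent delta-rs writers on S3 coordinate commits. The bucket path and lock-table name `delta_log` are placeholder assumptions; note this lock table is separate from, and unaware of, the Databricks commit service, which is the crux of the question.

```python
# Sketch: enable delta-rs's DynamoDB-based locking for S3 writes.
# "delta_log" and the s3:// path below are hypothetical examples.
storage_options = {
    # Tells delta-rs to coordinate S3 commits through DynamoDB
    "AWS_S3_LOCKING_PROVIDER": "dynamodb",
    # Name of the DynamoDB lock table (assumed name for illustration)
    "DELTA_DYNAMO_TABLE_NAME": "delta_log",
}

# The actual write would look something like this (requires the
# `deltalake` package and AWS credentials, so it is commented out here):
# from deltalake import write_deltalake
# write_deltalake(
#     "s3://my-bucket/my-table",
#     df,
#     mode="append",
#     storage_options=storage_options,
# )

print(sorted(storage_options))
```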