Handling Concurrent Writes to a Delta Table by delta-rs and Databricks Spark Job

prem14f
New Contributor II

Hi @dennyglee, @Retired_mod.

If I am writing data into the same Delta table from both delta-rs and a Databricks job, and some transactions are lost, how can I handle this?

Given that Databricks coordinates commits through its own commit service while delta-rs coordinates its transaction log through DynamoDB, how can we safely handle concurrent writers (Databricks jobs and delta-rs writers) on the same table?
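For context, this is the kind of setup I mean on the delta-rs side: a writer pointed at S3 with the DynamoDB locking provider enabled. This is only a sketch; the bucket, table path, region, and lock-table name below are placeholders, not my actual configuration.

```python
# Sketch of a delta-rs (Python `deltalake` package) writer using the
# DynamoDB locking provider so concurrent S3 commits are serialized.
# All names below (bucket, table path, region, lock table) are placeholders.

storage_options = {
    "AWS_REGION": "us-east-1",               # placeholder region
    "AWS_S3_LOCKING_PROVIDER": "dynamodb",   # serialize commits via DynamoDB
    "DELTA_DYNAMO_TABLE_NAME": "delta_log",  # placeholder DynamoDB lock table
}

# The actual write (commented out here since it needs live AWS credentials):
# from deltalake import write_deltalake
# write_deltalake(
#     "s3://my-bucket/my-delta-table",       # placeholder table path
#     df,
#     mode="append",
#     storage_options=storage_options,
# )
```

The concern is that this DynamoDB lock only coordinates delta-rs (and other LogStore-style) writers, while Databricks jobs commit through the Databricks commit service, so the two coordination mechanisms do not see each other.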