02-01-2022 06:58 PM
I have a table on which I do an upsert, i.e.
MERGE INTO table_name ...
After that I run
OPTIMIZE table_name
which throws this error:
java.util.concurrent.ExecutionException: io.delta.exceptions.ConcurrentDeleteReadException: This transaction attempted to read one or more files that were deleted (for example part-00000-50e8fcea-1314-445b-a4fd-a7b61a9bf02c-c000.snappy.parquet in the root of the table) by a concurrent update. Please try the operation again.
I'm not sure what's happening, as I am not deleting any files. Is there a way to fix this?
1 ACCEPTED SOLUTION
Accepted Solutions
02-02-2022 06:37 AM
- You can try changing the isolation level: https://docs.microsoft.com/en-us/azure/databricks/delta/optimizations/isolation-level
- In the MERGE, it is good practice to specify all partition columns in the merge condition, so concurrent operations touch disjoint sets of files.
- The error can also occur when the script runs concurrently with another write to the same table.
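The two suggestions above can be sketched as follows. This is a minimal, illustrative example: the table name `events`, the source `updates`, and the partition column `event_date` are all hypothetical, and relaxing the isolation level is only appropriate if the weaker `WriteSerializable` guarantees are acceptable for your workload.

```sql
-- Assumption: 'events' is partitioned by event_date; all names are illustrative.

-- 1. Relax the table's isolation level so OPTIMIZE (which only rearranges
--    existing data) is less likely to conflict with a concurrent MERGE:
ALTER TABLE events
  SET TBLPROPERTIES ('delta.isolationLevel' = 'WriteSerializable');

-- 2. Pin the MERGE to explicit partition values so the transaction does not
--    read files outside that partition, narrowing the window for a
--    ConcurrentDeleteReadException:
MERGE INTO events AS t
USING updates AS s
  ON t.event_date = '2022-02-01'   -- partition predicate limits the files read
 AND t.id = s.id
WHEN MATCHED THEN UPDATE SET *
WHEN NOT MATCHED THEN INSERT *;
```

With the partition predicate in place, an OPTIMIZE scoped to other partitions (e.g. `OPTIMIZE events WHERE event_date < '2022-02-01'`) can run alongside the MERGE without reading the same files.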