MetadataChangedException in Databricks
06-14-2023 05:46 AM
I am reading around 20 text files from ADLS, doing some transformations, and then writing the results back to ADLS into a single Delta table (all operations run in parallel through a thread pool). So 20 threads are appending to the same Delta path using df.write in append mode, and I am getting the error below:
"Exception type : MetadataChangedException Exception message : The metadata of the Delta table has been changed by a concurrent update. Please try the operation again. ,operation:WRITE,operationParameters:{mode:Append,partitionBy:[]} readVersion:0,isolationLevel:WriteSerializable,isBlindAppend:true,operationMetrics:{numFiles:20,numOutputRows:1650135,numOutputBytes:80649658}} Refer to https://docs.microsoft.com/azure/databricks/delta/concurrency-control for more details"
This is an INSERT operation, so it should be treated as a blind insert under the default isolation level. Why am I getting a concurrent update / metadata change exception here? Any ideas?
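For context, the write pattern is roughly the sketch below (not my actual code; file names, paths, and the transformation step are placeholders):

from concurrent.futures import ThreadPoolExecutor

input_paths = [f"/mnt/r/input/file_{i}.txt" for i in range(20)]  # hypothetical paths
target_path = "/mnt/r/xyz.delta"

def transform_and_append(path):
    # `spark` is the session Databricks provides in every notebook.
    df = spark.read.text(path)
    # ... transformations ...
    df.write.format("delta").mode("append").save(target_path)

# 20 threads all append to the same Delta table concurrently.
with ThreadPoolExecutor(max_workers=20) as pool:
    list(pool.map(transform_and_append, input_paths))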
06-14-2023 06:03 AM
Can you share some sample code?
06-14-2023 06:33 AM
See the code below
df.write.format("delta")\
.mode('append')\
.option("overwriteSchema",False)\
.option("mergeSchema",True)\
.save("/mnt/r/xyz.delta")
08-30-2023 06:34 AM
I have seen this problem when an identity column causes concurrency issues, but you seem to be getting a similar error on plain appends. I don't completely know your use case here, but I would advise retrying the operation by handling MetadataChangedException in a try/except block. Let us know what solves the problem.
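Something along these lines could be a starting point. This is only a sketch: it assumes the delta-spark Python bindings expose MetadataChangedException under delta.exceptions on your runtime, and the retry count and backoff values are placeholders you would want to tune.

import time
from delta.exceptions import MetadataChangedException  # assumed available on your DBR / delta-spark version

def append_with_retry(df, path, max_retries=5):
    for attempt in range(max_retries):
        try:
            df.write.format("delta").mode("append").save(path)
            return
        except MetadataChangedException:
            # Another thread updated the table metadata; back off and try again.
            time.sleep(2 ** attempt)
    raise RuntimeError(f"Append to {path} still failed after {max_retries} retries")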
02-20-2025 03:12 AM
How can we import the MetadataChangedException exception?
Or does Databricks recommend catching a generic Exception and parsing the message string?

