I am reading around 20 text files from ADLS, applying some transformations, and then writing the results back to ADLS into a single Delta table (all operations run in parallel through a thread pool). So 20 threads are writing to the same Delta table concurrently, each using the DataFrame append mode, and I am getting the error below:
"Exception type : MetadataChangedException Exception message : The metadata of the Delta table has been changed by a concurrent update. Please try the operation again. ,operation:WRITE,operationParameters:{mode:Append,partitionBy:[]} readVersion:0,isolationLevel:WriteSerializable,isBlindAppend:true,operationMetrics:{numFiles:20,numOutputRows:1650135,numOutputBytes:80649658}} Refer to https://docs.microsoft.com/azure/databricks/delta/concurrency-control for more details"
Since this is an INSERT-only operation, it should be treated as a blind append under the default WriteSerializable isolation level, so why am I getting a concurrent update / metadata changed exception here? Any ideas? For reference, a simplified sketch of what each thread does is shown below.
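This is not my exact code, just a minimal sketch of the pattern I described above; the paths, file names, and the `process_and_append` helper are placeholders, and the transformations are elided:

```python
from concurrent.futures import ThreadPoolExecutor

# Placeholder ADLS paths -- the real ones point at my storage account
SOURCE_PATHS = [
    f"abfss://container@account.dfs.core.windows.net/input/file_{i}.txt"
    for i in range(20)
]
TARGET_PATH = "abfss://container@account.dfs.core.windows.net/output/delta_table"

def process_and_append(path):
    # Read one text file, apply transformations (elided), then append to the
    # shared Delta table. No overwrite / replaceWhere, so it is a blind append.
    df = spark.read.text(path)
    transformed = df  # ... transformations go here ...
    (transformed.write
        .format("delta")
        .mode("append")
        .save(TARGET_PATH))

# All 20 files are processed concurrently on the same cluster
with ThreadPoolExecutor(max_workers=20) as pool:
    list(pool.map(process_and_append, SOURCE_PATHS))
```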