I'm copying data from a foreign catalog into a target table using REPLACE WHERE logic, and this works fine for two other tables. But for one specific table, I keep getting this error:
Table does not support overwrite by expression: DeltaTableV2(org.apache.spark.sql.SparkSession@...,abfss://...@....dfs.core.windows.net/__unitystorage/catalogs/.../tables/...,Some(CatalogTable( Catalog: ...Database: ...Table: ...Owner: c... Created Time: ... Last Access: UNKNOWN Created By: Spark Type: MANAGED Provider: delta Table Properties: [delta.checkpoint.writeStatsAsJson=false, delta.checkpoint.writeStatsAsStruct=true, delta.columnMapping.maxColumnId=91, delta.columnMapping.mode=name, delta.enableChangeDataFeed=false, delta.lastCommitTimestamp=1700647538000, delta.lastUpdateVersion=0, delta.minReaderVersion=2, delta.minWriterVersion=5]
Has anybody seen this before? I tried dropping the table and recreating it under another name, but nothing works. It seems there is either some issue with the field names or maybe something in the data? The table copies fine if I don't use the REPLACE WHERE logic in the query.
This fails:
INSERT INTO mycatalog.mydb.mytable
REPLACE WHERE myfield >= {{ pipeline param }}
SELECT * FROM myforeigncatalog.mydb.mytable;
This works:
INSERT INTO mycatalog.mydb.mytable
SELECT * FROM myforeigncatalog.mydb.mytable;
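For reference, the Delta table properties quoted in the error message can be inspected directly on the target table. This is just an illustrative query, using the same placeholder table name as above:

SHOW TBLPROPERTIES mycatalog.mydb.mytable;
-- the error output above shows, among others, delta.columnMapping.mode=name and delta.minWriterVersion=5 on the failing table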