I'm using Unity Catalog.
I've changed the schema of my table by overwriting it with a newer file:
# Overwrite both the data and the schema of the existing Delta table at this path
df.write \
    .format("delta") \
    .partitionBy(partitionColumn) \
    .mode("overwrite") \
    .option("overwriteSchema", "true") \
    .save(destinationPath)
Queries against information_schema.columns within the catalog, and against system.information_schema.columns, both return the previous schema, not the current one.
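For illustration, these are the kinds of queries that still return the old columns; the catalog, schema, and table names below are placeholders, not my real object names:

# Hypothetical names (my_catalog, my_schema, my_table) used only to illustrate the queries being run
spark.sql("""
    SELECT column_name, data_type
    FROM my_catalog.information_schema.columns
    WHERE table_schema = 'my_schema'
      AND table_name = 'my_table'
""").show()

spark.sql("""
    SELECT column_name, data_type
    FROM system.information_schema.columns
    WHERE table_catalog = 'my_catalog'
      AND table_schema = 'my_schema'
      AND table_name = 'my_table'
""").show()

Both return the column list from before the overwrite.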
The Databricks UI catalog browser also shows the old columns.
This is a problem, as many third-party data visualization and integration tools, as well as our own code, rely on querying the information schema.
How can I force Databricks to show the correct metadata?