06-20-2024 08:36 PM
I'm using Unity Catalog. I've changed the schema of my table by overwriting it with a newer file:
df.write \
.format('delta') \
.partitionBy(partitionColumn) \
.mode("overwrite") \
.option("overwriteSchema", "true") \
.save(destinationPath)
Queries against information_schema.columns within the catalog, and against system.information_schema.columns, both return the previous schema, not the current one. The Databricks GUI catalog browser also shows the wrong columns.

This is a problem because many third-party data visualization and integration tools, as well as our own code, rely on querying the information schema. How can I force Databricks to show the correct metadata?
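For reference, the kind of metadata query that was returning stale results looks roughly like this; the catalog, schema, and table names below are hypothetical placeholders:

```sql
-- Returns the column list Databricks has recorded for the table;
-- after the overwrite this still reflected the old schema
SELECT column_name, data_type
FROM system.information_schema.columns
WHERE table_catalog = 'my_catalog'   -- hypothetical names
  AND table_schema  = 'my_schema'
  AND table_name    = 'my_table'
ORDER BY ordinal_position;
```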
Accepted Solutions
06-20-2024 11:45 PM
06-24-2024 09:03 PM
Thanks Daniel, that worked.

I also had a workaround going with an ALTER TABLE ... SET TBLPROPERTIES (optimizeWrite = true), which I have set as a default anyway; just touching the table triggered a metadata refresh. I prefer your command, though, as that is its primary purpose.
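The workaround described above can be sketched as follows. The fully qualified table name is a hypothetical placeholder, and the property spelling is an assumption (the Delta auto-optimize write setting is documented as delta.autoOptimize.optimizeWrite):

```sql
-- Any "touch" of the table, such as setting a property it may already
-- have, appeared to trigger a metadata refresh; names are hypothetical
ALTER TABLE my_catalog.my_schema.my_table
  SET TBLPROPERTIES ('delta.autoOptimize.optimizeWrite' = 'true');
```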

