Get Started Discussions

information_schema not populating with columns

KIRKQUINBAR
New Contributor III

We started migrating databases from hive_metastore into Unity Catalog back in October 2024, and I've noticed that periodically the Catalog UI will not show columns or a data preview for some of the migrated tables, but not all of them. After some digging I realized that for some tables, the columns are not being populated into the system information_schema columns table, which may explain the UI issue.

I was able to reproduce the problem with some simple queries. This happens on an existing Databricks instance we have had running for years, onto which I added Unity Catalog. If I create a brand-new Databricks instance, the problem does not occur and the columns are populated in system information_schema. Has anyone else experienced this issue? The table itself does show up in the information_schema tables table.

select * from system.information_schema.tables where table_catalog='development' and table_schema='default' and table_name='test_base';
select * from system.information_schema.columns where table_catalog='development' and table_schema='default' and table_name='test_base';

select * from system.information_schema.tables where table_catalog='development' and table_schema='default' and table_name='test_copy';
select * from system.information_schema.columns where table_catalog='development' and table_schema='default' and table_name='test_copy';

-- run these together and table columns WILL NOT populate
drop table if exists development.default.test_base;
create table development.default.test_base (
    id int,
    display_name string
);

-- run this without dropping table and table columns WILL populate
-- if table is dropped first then table columns WILL NOT populate
create or replace table development.default.test_base (
    id int,
    display_name string
);

-- run these together and table columns WILL NOT populate
drop table if exists development.default.test_copy;
CREATE TABLE development.default.test_copy
   DEEP CLONE development.default.test_base;

-- run this without dropping table and table columns WILL populate
-- if table is dropped first then table columns WILL NOT populate
CREATE OR REPLACE TABLE development.default.test_copy
   DEEP CLONE development.default.test_base;   

This seems like a bug, or a setting that didn't get set when I converted this existing Databricks instance to use Unity Catalog.
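The per-table checks above can be generalized into a single query that lists every table missing column metadata. This is a sketch: the catalog filter 'development' is just an example, and it assumes the same system.information_schema views used in the queries above.

```sql
-- Sketch: list tables that appear in information_schema.tables but have
-- no rows in information_schema.columns (i.e., missing column metadata)
SELECT t.table_catalog, t.table_schema, t.table_name
FROM system.information_schema.tables AS t
LEFT ANTI JOIN system.information_schema.columns AS c
  ON  t.table_catalog = c.table_catalog
  AND t.table_schema  = c.table_schema
  AND t.table_name    = c.table_name
WHERE t.table_catalog = 'development';
```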

1 ACCEPTED SOLUTION


KIRKQUINBAR
New Contributor III

This is definitely a bug related to older instances of Azure Databricks that were upgraded to use the Unity platform. After going back and forth with MS support for 2+ months, we made the decision to just spin up a new instance of Azure Databricks and connect it to the same workspace. That solved the issue; the old instance has since been deprecated.


2 REPLIES

intuz
Contributor II

Yes, I’ve seen this too after migrating from Hive Metastore to Unity Catalog on an older workspace.

The issue seems to happen when using DROP TABLE followed by CREATE TABLE: the table appears in information_schema.tables but not in information_schema.columns. However, if you use CREATE OR REPLACE TABLE, the columns show up correctly.

It looks like a metadata sync bug in Unity Catalog, especially in older workspaces where Unity was added later. On newer workspaces, this doesn’t happen.

Workaround: Use CREATE OR REPLACE instead of dropping and recreating the table. You can also try running DESCRIBE TABLE or MSCK REPAIR TABLE to force a metadata refresh.
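Applied to the repro in the original post, the workaround looks like this (a sketch: replace the table in place rather than dropping it first, then verify against information_schema):

```sql
-- Instead of: DROP TABLE ...; CREATE TABLE ...
-- replace the table in place so column metadata stays in sync:
CREATE OR REPLACE TABLE development.default.test_base (
    id int,
    display_name string
);

-- Then verify the columns are registered:
SELECT column_name, data_type
FROM system.information_schema.columns
WHERE table_catalog = 'development'
  AND table_schema  = 'default'
  AND table_name    = 'test_base';
```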

I’d recommend reporting this to Databricks Support — likely a backend fix is needed to fully sync metadata after migration.

