Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Delta table in catalogue are showing but DO NOT exist

RobDineen
New Contributor III

I have just started working with Databricks, and have come across an issue where delta tables show up in the catalogue but do not actually exist. See the screenshot for the script I've been running and the error messages.

RobDineen_0-1728648443009.png

Is this a bug, or am I missing something obvious here?
Surely I cannot be this bad LOL

Any help would be appreciated.

Regards, Rob

1 ACCEPTED SOLUTION

Accepted Solutions

saurabh18cs
Contributor II

Ensure you are specifying the database (or schema) where the table should be created.

(If you are using the default database, you can omit the database name.)

 

database_name = "your_database_name"  # Replace with your actual database name
table_name = f"{database_name}.season"
target_df.write.mode("overwrite").saveAsTable(table_name)
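If in doubt, a quick sanity check that the table actually registered can follow the write. This is a sketch, assuming the same notebook session; `your_database_name` is a placeholder, and `spark`/`target_df` only exist inside the notebook:

```python
# Sketch of a post-write sanity check; "your_database_name" is a placeholder,
# and `spark`/`target_df` are assumed to exist in the notebook session.
def qualified(database: str, table: str) -> str:
    """Join database and table into the name saveAsTable expects."""
    return f"{database}.{table}"

table_name = qualified("your_database_name", "season")

# In the notebook you would then run (requires Spark):
# target_df.write.mode("overwrite").saveAsTable(table_name)
# print(spark.catalog.tableExists(table_name))  # should print True
```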


6 REPLIES

filipniziol
Contributor

Hi @RobDineen ,

Perhaps you tried to drop the table, but due to some corruption, some files remained in its location.

Could you check if any files exist in the table location?

display(dbutils.fs.ls("dbfs:/user/hive/warehouse/season"))

If any files exist, just delete them and try to recreate the table.
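The clean-up flow could look something like this. It is a hedged sketch, assuming the default Hive warehouse path and a notebook session where `dbutils`, `spark`, and `target_df` exist:

```python
# Sketch of the suggested clean-up; the path and table names are assumptions.
leftover_path = "dbfs:/user/hive/warehouse/season"

def has_leftovers(listing) -> bool:
    """True if the table location still contains any files or folders."""
    return len(listing) > 0

# In the notebook (requires Spark/dbutils):
# files = dbutils.fs.ls(leftover_path)
# if has_leftovers(files):
#     dbutils.fs.rm(leftover_path, recurse=True)  # delete residual files
# spark.sql("DROP TABLE IF EXISTS season")        # clear any stale entry
# target_df.write.saveAsTable("season")           # then recreate the table
```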


Hi there, sorry it's taken a while to get back, I've been on holiday.

I have tried the above suggestion and I get the following 2 rows back:

RobDineen_0-1729433888450.png

Surely that is just telling us there is one DB called season?

Also, if I run the following, it gives me a list of all DBs:

RobDineen_0-1729434896272.png

But why does the catalogue have 3 objects with the same name?

RobDineen_1-1729434955680.png

Also, if I drop one of the objects, I get the following:

RobDineen_0-1729436296447.png



So where are the 3 tables in the list coming from?

filipniziol
Contributor

Hi @RobDineen ,

Based on your screenshots, it seems that there are residual files in the storage location causing the tables to appear in the catalog even though they don't exist properly. This can happen if a table was not dropped cleanly or if there was some corruption during deletion.

Could you remove the files and folders at the season table's location?

table_path = "dbfs:/user/hive/warehouse/season"
dbutils.fs.rm(table_path, recurse=True)

Could you check whether removing the files helps, so that the tables no longer appear in the catalog and you are able to run the CREATE TABLE statement?
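One way to confirm the clean-up worked is to count matching catalog entries before and after. A sketch; the schema name `default` is an assumption, and `spark` only exists in the notebook:

```python
# Hypothetical verification that the stale entries are gone; assumes the
# table lived in the `default` schema of a notebook session with `spark`.
def entries_named(table_names, target="season"):
    """Count how many catalog entries share the given name."""
    return sum(1 for name in table_names if name == target)

# In the notebook (requires Spark):
# names = [t.name for t in spark.catalog.listTables("default")]
# print(entries_named(names))  # expect 0 after the rm, 1 after recreating
```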





Panda
Valued Contributor

@RobDineen Can you try refreshing your table (REFRESH TABLE your_catalog.your_schema.your_table), followed by spark.catalog.clearCache()?

Then try the drop operation:

 

table_path = "dbfs:/user/hive/warehouse/season"
dbutils.fs.rm(table_path, recurse=True)
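Putting those steps in order might look like the sketch below, assuming a three-part Unity Catalog name; `your_catalog` and `your_schema` are placeholders to replace with your own:

```python
# Hedged sketch of refresh -> clear cache -> remove files; all names here
# are placeholders, and `spark`/`dbutils` exist only in a notebook session.
def refresh_stmt(catalog: str, schema: str, table: str) -> str:
    """Build the REFRESH TABLE statement for a three-part table name."""
    return f"REFRESH TABLE {catalog}.{schema}.{table}"

stmt = refresh_stmt("your_catalog", "your_schema", "season")

# In the notebook (requires Spark/dbutils):
# spark.sql(stmt)
# spark.catalog.clearCache()
# dbutils.fs.rm("dbfs:/user/hive/warehouse/season", recurse=True)
```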

 

RobDineen
New Contributor III

Thank you for your reply. I ran clearCache() but it did not make any difference. However,

I deleted all the code in my notebook, and as I was clearing it out, the tables disappeared one by one, as if the (For you) tab was working like a history of what would end up being the result?

RobDineen_0-1729594806150.png

 


