10-11-2024 05:10 AM
I am just working with Databricks, and have come across an issue where Delta tables have been created in the catalog but do not actually exist. See the screenshot for the script I've been running and the error messages.
Is this a bug, or am I missing something obvious here?
Surely I cannot be this bad LOL.
Any help would be appreciated.
Regards, Rob
10-11-2024 07:46 AM
Hi @RobDineen ,
Maybe you tried to drop the table, but due to some corruption, some files remained in the table's location.
Could you check if any files exist in the table location?
display(dbutils.fs.ls("dbfs:/user/hive/warehouse/season"))
If any files exist, just delete them and try to recreate the table.
10-20-2024 07:36 AM - edited 10-20-2024 07:58 AM
Hi there, sorry it's taken a while to get back; I've been on holiday.
I have tried the above suggestion and I get the following 2 rows back.
Surely that is just telling us there is one DB called season?
Also, if I run the following, it gives me a list of all DBs.
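(The command itself is only visible in the screenshot; a typical way to list all databases in a Databricks notebook, as an assumption about what was run, is:)

```sql
-- List all databases (schemas) visible in the current catalog
SHOW DATABASES;
```

In Python, spark.catalog.listDatabases() returns the same information.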
But why does the catalog have 3 objects with the same name?
Also, if I drop one of the objects, I get the following.
So where are the 3 tables in the list coming from?
10-20-2024 08:53 AM
Hi @RobDineen ,
Based on your screenshots, it seems that there are residual files in the storage location that are causing the tables to appear in the catalog even though they don't exist properly. This can happen if a table was not dropped cleanly or if there was some corruption during deletion.
Could you remove the files and folders at the season table's location?
table_path = "dbfs:/user/hive/warehouse/season"
dbutils.fs.rm(table_path, recurse=True)
Could you check whether removing the files helps, the tables no longer appear in the catalog, and you are able to run the CREATE TABLE statement?
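A quick way to confirm the stale entry is gone before recreating the table is a sketch like the following (assuming the table lives in the default Hive schema; the column list is hypothetical):

```sql
-- Check whether the catalog still lists the table
SHOW TABLES IN default LIKE 'season';

-- If nothing is returned, recreate it
CREATE TABLE default.season (
  season_id INT,
  season_name STRING
) USING DELTA;
```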
10-21-2024 01:50 AM
Ensure you are specifying the database (or schema) where the table should be created.
(If you are using the default database, you can omit the database name.)
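For example, fully qualifying the name makes it unambiguous where the table is created (the schema and column names here are hypothetical):

```sql
-- Create the schema if it does not exist, then create the table inside it
CREATE SCHEMA IF NOT EXISTS my_schema;

CREATE TABLE IF NOT EXISTS my_schema.season (
  season_id INT,
  season_name STRING
) USING DELTA;
```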
10-21-2024 05:09 AM - edited 10-21-2024 05:10 AM
@RobDineen Can you try refreshing your table (REFRESH TABLE your_catalog.your_schema.your_table), followed by spark.catalog.clearCache()?
Then try removing the table's files:
table_path = "dbfs:/user/hive/warehouse/season"
dbutils.fs.rm(table_path, recurse=True)
10-22-2024 04:00 AM
Thank you for your reply. I ran clearCache(), but it did not make any difference. However,
I deleted all the code within my notebook, and as I was clearing out the code, the tables disappeared one by one, as if the (For you) tab was working like a history, or a preview of what the end result would be???