10-11-2024 05:10 AM
I am just working with Databricks and have come across an issue where Delta tables have been created in the catalog but do not actually exist. See the screenshot for the script I've been running and the error messages.
Is this a bug, or am I missing something obvious here?
Surely I cannot be this bad, LOL.
Any help would be appreciated.
Regards, Rob
4 weeks ago
Ensure you are specifying the database (or schema) where the table should be created.
(If you are using the default database, you can omit the database name.)
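For example, something like this (a minimal sketch only; the catalog, schema, and column names here are placeholders, adjust them to your environment):
# Create the Delta table under an explicit catalog and schema instead of relying on the default
spark.sql("CREATE TABLE IF NOT EXISTS my_catalog.my_schema.season (id INT, name STRING) USING DELTA")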
10-11-2024 07:46 AM
Hi @RobDineen ,
Maybe you tried to drop the table, but due to some corruption some files remained in the location.
Could you check if any files exist in the table location?
# List whatever files remain at the table's warehouse location
display(dbutils.fs.ls("dbfs:/user/hive/warehouse/season"))
If any files exist, just delete them and try to recreate the table.
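To cross-check what the metastore itself thinks exists, you could also list the registered tables (a sketch, assuming the table was created in the default database):
# List the tables the catalog knows about, to compare against the files on disk
display(spark.sql("SHOW TABLES IN default"))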
4 weeks ago - last edited 4 weeks ago
Hi there, sorry it's taken a while to get back, I've been on holiday.
I have tried the above suggestion and I get the following 2 rows back.
Surely that is just telling us there is one DB called season?
Also, if I run the following, it gives me a list of all the DBs.
But why does the catalog have 3 objects with the same name?
Also, if I drop one of the objects, I get the following.
So where are the 3 tables in the list coming from?
4 weeks ago
Hi @RobDineen ,
Based on your screenshots, it seems there are residual files in the storage location that are causing the tables to appear in the catalog even though they don't exist properly. This can happen if a table was not dropped cleanly or if there was some corruption during deletion.
Could you remove the files and folders at the season table's location?
# Recursively delete the leftover files at the season table's warehouse location
table_path = "dbfs:/user/hive/warehouse/season"
dbutils.fs.rm(table_path, recurse=True)
Could you check whether removing the files helps, i.e. the tables no longer appear in the catalog and you are able to run the CREATE TABLE statement?
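Putting the whole suggestion together as one cell (a sketch only; the column definition is a placeholder, since we don't know your actual schema):
# 1. Remove the leftover files at the table's warehouse location
dbutils.fs.rm("dbfs:/user/hive/warehouse/season", recurse=True)
# 2. Confirm the catalog no longer lists the table
display(spark.sql("SHOW TABLES IN default"))
# 3. Recreate the table
spark.sql("CREATE TABLE IF NOT EXISTS default.season (id INT) USING DELTA")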
4 weeks ago - last edited 4 weeks ago
@RobDineen Can you try refreshing your table (REFRESH TABLE your_catalog.your_schema.your_table), followed by spark.catalog.clearCache()?
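As code, that would be roughly (assuming the three-part name points at your season table):
# Re-read the table's metadata, then clear all cached tables and metadata
spark.sql("REFRESH TABLE your_catalog.your_schema.your_table")
spark.catalog.clearCache()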
Then try the drop operation:
# Recursively remove the leftover files for the season table
table_path = "dbfs:/user/hive/warehouse/season"
dbutils.fs.rm(table_path, recurse=True)
4 weeks ago
Thank you for your reply. I ran the clearCache() but it did not make any difference. However,
I deleted all my code within my notebook, and as I was clearing out the code, the tables disappeared one by one, as if the (For you) tab was acting like a history, or a preview of what the end result will be?