Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Error: The associated location ... is not empty but it's not a Delta table

Chris_Konsur
New Contributor III

I tried to create a table but got this error: AnalysisException: Cannot create table ('`spark_catalog`.`default`.`citation_all_tenants`'). The associated location ('dbfs:/user/hive/warehouse/citation_all_tenants') is not empty but it's not a Delta table

I checked the table location with dbutils.fs.ls('dbfs:/user/hive/warehouse/') and the table directory is still there with size=0:

FileInfo(path='dbfs:/user/hive/warehouse/citation_all_tenants/', name='citation_all_tenants/', size=0, modificationTime=1678202578000),

I tried to delete the directory with dbutils.fs.rm('dbfs:/user/hive/warehouse/citation_all_tenants', recurse=True), but it does not remove it. My question is: how do I delete and recreate the table?

1 ACCEPTED SOLUTION

Lakshay
Databricks Employee

Hi @Chris Konsur​ , This issue generally happens when using DROP TABLE and CREATE TABLE commands. Databricks recommends using CREATE OR REPLACE to overwrite a Delta table rather than dropping and recreating it.

Also, to fix the issue you need to delete the table directory. You can try the following methods:

  1. If the table definition exists, drop the previous table using the DROP TABLE command.
  2. If the table definition does not exist, remove the table directory using the "rm -r" command.
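The two steps above can be sketched as follows (table name taken from the original post; adjust for your catalog and schema):

```sql
-- 1. If the table definition still exists in the metastore, drop it
DROP TABLE IF EXISTS spark_catalog.default.citation_all_tenants;

-- 2. If the directory is still left behind on DBFS, remove it from a
--    notebook cell, which is the DBFS equivalent of "rm -r":
--    dbutils.fs.rm('dbfs:/user/hive/warehouse/citation_all_tenants', recurse=True)
```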


4 REPLIES

pvignesh92
Honored Contributor

@Chris Konsur​ Hi. You did not mention the command you used to create the table. However, the issue I can see here is that the table entry already exists in the catalog while you are trying to create a new table with the same name. If that's the case, either of the two approaches below should work:

  1. Drop the existing table using the DROP TABLE command and then create your new table, so that the old entry is cleared from the catalog.
  2. Use the CREATE OR REPLACE TABLE command.

Deleting the filesystem path will not remove the table entry from the catalog, but dropping the table will remove the file path if it is a managed table. Your table appears to be managed, since its path is under the Hive warehouse directory. Please try this and let us know if it works.
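A minimal CREATE OR REPLACE TABLE sketch for the second approach (the column definitions here are hypothetical, for illustration only):

```sql
-- Overwrites the existing Delta table in place, keeping the catalog entry
-- consistent with the underlying files; the columns below are made up
CREATE OR REPLACE TABLE spark_catalog.default.citation_all_tenants (
  tenant_id   STRING,
  citation_id STRING,
  cited_at    TIMESTAMP
) USING DELTA;
```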



sachin_tirth
New Contributor II

Hi Team, I am facing the same issue. When we try to load data into a table in our production batch, we get an error saying the table is not in Delta format. There has been no recent change to the table, and we are not running any CREATE OR REPLACE TABLE; it is an existing production table. This has happened multiple times, and each time we had to drop the table and clean the ABFS path to recreate it. That is not a good approach, as we might lose historical data. Can somebody explain why this error occurs when the table is already in Delta format?
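Before dropping anything, it may help to confirm what format the metastore has actually recorded for the table (a diagnostic sketch; the table name here is hypothetical):

```sql
-- DESCRIBE DETAIL reports the table's format ('delta' for a healthy Delta
-- table), its location, and other metadata
DESCRIBE DETAIL my_prod_table;

-- DESCRIBE EXTENDED shows the provider and location stored in the metastore
DESCRIBE EXTENDED my_prod_table;
```

If the reported location no longer contains a _delta_log directory, the files may have been modified outside of Delta, which can trigger this kind of error.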
