
Delta Live Table: [TABLE_OR_VIEW_ALREADY_EXISTS] Cannot create table or view

SyedSaqib
New Contributor II

Hi,

I have a Delta Live Tables pipeline with its storage location configured to a cloud blob store.

Syntax of the bronze table in the notebook
===

import dlt

@dlt.table(
    spark_conf={"spark.databricks.delta.schema.autoMerge.enabled": "true"},
    table_properties={"quality": "bronze"},
)
def sap_mdo_sfc_bronze():
    # Incrementally ingest JSON files with Auto Loader; `schema` and
    # `data_path` are defined elsewhere in the notebook.
    return (
        spark.readStream
        .schema(schema)
        .format("cloudFiles")
        .option("cloudFiles.format", "json")
        .option("cloudFiles.inferColumnTypes", True)
        .option("multiline", "true")
        .option("header", "true")
        .option("cloudFiles.schemaLocation", data_path + "/SCHEMA")
        .load(data_path + "/DATA")
        .select("*")
    )

===
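
For reference, `schema` and `data_path` are not shown above; hypothetical placeholder definitions (illustrative values only) would look like this:

===

from pyspark.sql.types import StructType, StructField, StringType

# Hypothetical values for illustration; the real path and schema are defined
# elsewhere in the notebook.
data_path = "abfss://landing@storageaccount.dfs.core.windows.net/sap_mdo_sfc"
schema = StructType([
    StructField("sfc", StringType(), True),
    StructField("plant", StringType(), True),
])

===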

Once the Delta Live Tables pipeline runs, it creates the tables in blob storage, along with their metadata in the Hive metastore under the specified schema.
Issue: When I start a pipeline update for the second time, it fails with the error below:
====
org.apache.spark.sql.catalyst.analysis.TableAlreadyExistsException: [TABLE_OR_VIEW_ALREADY_EXISTS] Cannot create table or view `tenant_id`.`table_bronze` because it already exists. Choose a different name, drop or replace the existing object, add the IF NOT EXISTS clause to tolerate pre-existing objects, or add the OR REFRESH clause to refresh the existing streaming table.
====

As a workaround, I first delete the table from the Hive metastore and then start the pipeline update; it then runs successfully.
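
A minimal sketch of that cleanup step, assuming the schema and table name from the error message (run from a regular notebook, not from inside the pipeline):

===

# Drop the leftover Hive metastore entry so the next pipeline update can
# recreate the table. `spark` is the ambient SparkSession in a Databricks
# notebook; the table name below is taken from the error message.
spark.sql("DROP TABLE IF EXISTS tenant_id.table_bronze")

===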

Can anyone help me understand this issue?

Thanks and regards,
Syed Saqib

2 REPLIES

Kaniz_Fatma
Community Manager

Hi @SyedSaqib

  • Rename the table you’re trying to create to a name that doesn’t conflict with an existing table (a sketch follows this list).
  • If you’re intentionally replacing the existing table, you can drop it first and then create the new one. Be cautious with this approach, as it permanently removes the existing data.
  • When creating the table, add the IF NOT EXISTS clause to tolerate pre-existing objects; that way, if the table already exists, the creation won’t fail.
  • If you’re updating an existing streaming table, consider using the OR REFRESH clause.
  • Your workaround of manually deleting the table from the Hive metastore before starting the pipeline update is valid. However, it’s essential to understand why the table persists between runs and to address the root cause.
  • To understand why the table persists, consider the following:
    • Is the table being created outside of your pipeline (e.g., by another process)?
    • Are there multiple instances of the pipeline running concurrently?
    • Is the table being cached or retained unintentionally?
  • If you have any specific details about your pipeline configuration or additional context, feel free to share, and we can dive deeper! 😊
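
A minimal Python sketch of the rename option, using a hypothetical non-conflicting name (note that IF NOT EXISTS and OR REFRESH are clauses of SQL table definitions; in a Python pipeline the table name is controlled through the @dlt.table decorator instead):

===

import dlt

# Hypothetical rename for illustration: the `name` argument sets the metastore
# table name, so it no longer collides with the existing `table_bronze` object.
@dlt.table(
    name="sap_mdo_sfc_bronze_v2",  # illustrative non-conflicting name
    table_properties={"quality": "bronze"},
)
def sap_mdo_sfc_bronze_v2():
    # Same Auto Loader source as the original bronze definition; `schema` and
    # `data_path` are assumed to be defined elsewhere in the notebook.
    return (
        spark.readStream
        .schema(schema)
        .format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load(data_path + "/DATA")
    )

===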

SyedSaqib
New Contributor II

Hi Kaniz,

Thanks for replying.
I am using Python to create the Delta Live Tables, so how can I set these configurations?

  • When creating the table, add the IF NOT EXISTS clause to tolerate pre-existing objects.
  • consider using the OR REFRESH clause

 

Answering your questions:

  • Is the table being created outside of your pipeline (e.g., by another process)? --> No
  • Are there multiple instances of the pipeline running concurrently? --> No
  • Is the table being cached or retained unintentionally? --> No

The configurations shown in the description are the only ones I have set.
