Hi All,
I have been facing an issue with a few of my DLT pipelines.
Source code:
CREATE OR REFRESH STREAMING TABLE ****
TBLPROPERTIES (
  "delta.enableChangeDataFeed" = "true",
  "delta.autoOptimize.optimizeWrite" = "true"
)
AS SELECT
  *,
  _metadata.file_path AS filepath,
  current_timestamp() AS load_date_time
FROM cloud_files(
  "abfss://*********@**********.dfs.core.windows.net/*******/******",
  "csv",
  map("inferSchema", "true", "mergeSchema", "true")
)
I run this pipeline on a daily basis; it uses Auto Loader to refresh the streaming table from the CSV files at that path (a rough Python equivalent of the table definition is sketched below for reference).
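For reference, this is roughly what the same table definition would look like in Python; the table name, path, and placeholder values simply mirror the masked SQL above and are not my actual values:

import dlt
from pyspark.sql import functions as F

@dlt.table(
    name="bronze_table",  # placeholder table name
    table_properties={
        "delta.enableChangeDataFeed": "true",
        "delta.autoOptimize.optimizeWrite": "true",
    },
)
def bronze_table():
    # Auto Loader (cloudFiles) read of the CSV files, mirroring the cloud_files() call in the SQL
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "csv")
        .option("inferSchema", "true")
        .option("mergeSchema", "true")
        .load("abfss://<container>@<account>.dfs.core.windows.net/<path>")
        .withColumn("filepath", F.col("_metadata.file_path"))
        .withColumn("load_date_time", F.current_timestamp())
    )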
However, this pipeline throws an error:
Update 2c50a6 has failed while trying to update table 'cm-dev-upark-adb-001.bronze.__materialization_mat_*******************************.
com.databricks.sql.managedcatalog.UnityCatalogServiceException: [RequestId=***** ErrorClass=TABLE_ALREADY_EXISTS.RESOURCE_ALREADY_EXISTS] Table '*******' already exists
Any ideas as to what might be the issue? I have a few DLT pipelines which use Auto Loader to query files from the same container within that storage account, but some run fine and some throw this error.