Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

DLT Apply Changes Tables corrupt

MatthewMills
New Contributor III

Got a weird DLT error.

Test harness using the new(ish) 'Apply Changes from Snapshot' functionality and DLT Serverless (Current channel), in the Azure Australia East region.

It has been working for several months without issue, but within the last week these DLT tables have started failing with this error:

 
Your request failed with status FAILED: [BAD_REQUEST] Table 'mycatalog.myschema.mytable' is invalid because one of the underlying resources does not exist. Table '__databricks_internal.__dlt_materialization_schema_##################.__materialization_mat_mytable_1' does not exist.
 
1) DLT does not need to be running for the tables to stop working. Within a few hours of the pipeline running, all the tables in the job stop working with the above error.
2) If the broken tables are manually deleted via the UC browser, DLT undrops and recreates them correctly. A few hours later they break again.
3) This happens across multiple environments within the tenancy.
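For context, a minimal pipeline using this functionality looks roughly like the sketch below. The table and snapshot names and the key column are hypothetical placeholders, and the code only runs inside a Databricks DLT pipeline (the `dlt` module is not available outside one):

```python
import dlt  # only available inside a Delta Live Tables pipeline

# Target streaming table that apply_changes_from_snapshot() maintains.
# Behind the scenes DLT also creates a hidden backing table under
# __databricks_internal; that backing table is the "underlying resource"
# the error message reports as missing.
dlt.create_streaming_table("mytable")

dlt.apply_changes_from_snapshot(
    target="mytable",
    source="myschema.mysnapshot",  # hypothetical source snapshot table
    keys=["id"],                   # hypothetical primary key column
    stored_as_scd_type=1,
)
```

The failure described above is not in this pipeline code; it is the catalog-level link from `mycatalog.myschema.mytable` to the internal materialization table that breaks after the pipeline has finished.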
 
ACCEPTED SOLUTION

Lakshay
Databricks Employee

We have an open ticket for this issue. It is caused by the maintenance pipeline renaming the backing table. We expect a fix to be rolled out soon.


3 REPLIES

Pelle123
New Contributor II

I'm experiencing the same issue.

mjbobak
Contributor

We have the same error. It does not seem to be related to the Current or Preview channel setting in the DLT configuration.

region: Azure East US 2

For us, the pipeline completes successfully, but the corresponding connection from the catalog to the underlying data is missing. This appears to affect only streaming tables fed by the apply_changes_from_snapshot() API; standard streaming tables and materialized views appear fine.

