DLT Direct Publish Mode does not Handle Constraint Dependencies
05-10-2025 09:38 AM
I'm having some issues with direct publish mode when defining a DLT workflow that includes tables whose schemas declare foreign key constraints. When a foreign key references a table that does not appear in any of the defining query's joins, the pipeline doesn't seem to recognize it as a dependency, and it fails with an error stating the table or view does not exist:
```
org.apache.spark.sql.catalyst.analysis.NoSuchTableException: [TABLE_OR_VIEW_NOT_FOUND] The table or view `sample_catalog`.`sales_gold`.`user` cannot be found. Verify the spelling and correctness of the schema and catalog. If you did not qualify the name with a schema, verify the current_schema() output, or qualify the name with the correct schema and catalog. To tolerate the error on drop use DROP VIEW IF EXISTS or DROP TABLE IF EXISTS. SQLSTATE: 42P01
```
I'm using these constraints purely for informational/metadata purposes. From my reading, constraints are not enforced on DLT materialized views, which is fine for my use case. This worked a couple of weeks ago, but now the pipeline fails to identify the dependency and appears to check for the referenced table's existence while the tables are being updated in parallel.
Any ideas on how I can approach this? The information is very nice to have, especially with the ERD the interface generates from it. I've tried adding the NOT ENFORCED clause to the constraints, to no avail.
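For reference, here is a minimal sketch of the kind of definition I'm talking about. The catalog, table, and column names are placeholders, not my actual schema:

```sql
-- Hypothetical gold-layer materialized view whose foreign key
-- references a table (`user`) that never appears in the query itself.
CREATE OR REFRESH MATERIALIZED VIEW sales_gold.orders (
  order_id BIGINT NOT NULL,
  user_id  BIGINT,
  amount   DECIMAL(10, 2),
  CONSTRAINT orders_pk PRIMARY KEY (order_id) NOT ENFORCED,
  -- Informational only, yet this is the reference that seems to
  -- trigger TABLE_OR_VIEW_NOT_FOUND when `sales_gold`.`user` is
  -- built in parallel in the same pipeline.
  CONSTRAINT orders_user_fk FOREIGN KEY (user_id)
    REFERENCES sales_gold.user (user_id) NOT ENFORCED
)
AS SELECT order_id, user_id, amount
   FROM sales_silver.orders;
```

Note that `sales_gold.user` is only mentioned in the constraint clause, never in the `SELECT`, which is presumably why the dependency resolver misses it.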
- Labels: Workflows