05-10-2025 10:08 AM
This is a tricky situation where you want to leverage the metadata benefits (like the ERD visualization) without running into execution dependencies. Let me help you solve this issue.
The error suggests that DLT is trying to validate the foreign key relationships during pipeline execution, even though you want these constraints purely for metadata/documentation purposes. The NOT ENFORCED clause should work in theory, but it seems there are some limitations in the current implementation.
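For context, informational foreign keys are plain DDL in Unity Catalog. Below is a minimal sketch that builds the statement as a string so it can be reviewed before running; the table, constraint, and column names are only examples, and on Databricks you would execute the result with spark.sql():

```python
def fk_ddl(table: str, constraint: str, column: str,
           ref_table: str, ref_column: str) -> str:
    """Build an informational (NOT ENFORCED) foreign key statement."""
    return (
        f"ALTER TABLE {table} ADD CONSTRAINT {constraint} "
        f"FOREIGN KEY ({column}) REFERENCES {ref_table} ({ref_column}) "
        f"NOT ENFORCED"
    )

# On Databricks: spark.sql(fk_ddl("orders", "fk_orders_users",
#                                 "user_id", "users", "user_id"))
print(fk_ddl("orders", "fk_orders_users", "user_id", "users", "user_id"))
```

Because the constraint is NOT ENFORCED, it only feeds metadata features such as the ERD view; it never validates rows at write time.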
Solutions to Try
1. Use Development Mode First
While you iterate, run the pipeline in development mode (the Development/Production toggle in the pipeline UI). Development mode reuses the cluster between runs and disables automatic retries, so dependency errors surface quickly and are cheaper to debug.
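Development mode can also be set in the pipeline's JSON settings. A minimal fragment (the pipeline name is a placeholder and all other settings are omitted):

```json
{
  "name": "my-dlt-pipeline",
  "development": true
}
```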
2. Explicit Table Dependencies
Make the dependency visible to DLT by reading the parent table with dlt.read() inside the child table's definition. There is no dependencies= argument on @dlt.table; execution order is inferred from which tables each definition reads:
# Python
import dlt

@dlt.table(
    name="orders",
    comment="Orders table with FK to users",
    table_properties={"quality": "gold"}
)
def orders():
    # Reading the parent table via dlt.read() is what tells DLT to
    # materialize users before orders.
    users = dlt.read("users")
    # Keep only orders whose user_id exists in users (left semi join),
    # so the informational FK also holds in practice.
    return spark.table("source_orders").join(users, "user_id", "left_semi")
3. Use Schema Evolution Mode
If the failures stem from schema mismatches rather than table ordering, let Auto Loader rescue unexpected columns instead of failing the stream. There is no SCHEMA_EVOLUTION clause in DLT SQL; rescue mode is an Auto Loader option passed to cloud_files() at ingestion (the source path below is a placeholder):
CREATE OR REFRESH STREAMING LIVE TABLE orders
COMMENT "Orders table with FK to users"
AS SELECT * FROM cloud_files(
  "/path/to/source_orders",
  "json",
  map("cloudFiles.schemaEvolutionMode", "rescue")
)
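The same rescue-mode setting can be expressed from the Python API. A small helper that packages the Auto Loader options (the helper name is hypothetical; on Databricks you would splat the dict into readStream):

```python
def autoloader_rescue_options(fmt: str) -> dict:
    """Auto Loader options for rescue-mode schema evolution."""
    return {
        "cloudFiles.format": fmt,
        # Unexpected columns land in the _rescued_data column
        # instead of failing the stream.
        "cloudFiles.schemaEvolutionMode": "rescue",
    }

# On Databricks:
# spark.readStream.format("cloudFiles") \
#      .options(**autoloader_rescue_options("json")) \
#      .load("/path/to/source_orders")
print(autoloader_rescue_options("json"))
```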