I know how to validate column-level constraints, like checking whether a column's value is larger than a target value.
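For reference, this is the kind of row-level check that already works for me (the table is the `samples.tpch.orders` sample dataset; `o_totalprice` is one of its columns):

```python
import dlt

# Row-level expectation: the SQL expression is evaluated against each row.
@dlt.table
@dlt.expect("valid_price", "o_totalprice > 0")
def orders_validated():
    return spark.read.table("samples.tpch.orders")
```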
Can I validate table-level constraints as well? For example, checking whether the total record count of a table exceeds a specified number, or validating a foreign-key relationship between two tables (see the sketch below for what I mean by the latter).
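To make the foreign-key case concrete, this is the check I have in mind, written as plain Spark outside of DLT (table and column names are from the `samples.tpch` dataset; `orphans` is just a name I picked):

```python
# Intent: every o_custkey in orders should exist as c_custkey in customer.
orders = spark.read.table("samples.tpch.orders")
customer = spark.read.table("samples.tpch.customer")

# Rows in orders whose customer key has no match in customer.
orphans = orders.join(
    customer,
    orders.o_custkey == customer.c_custkey,
    "left_anti",
)
assert orphans.count() == 0, "orders references customer keys that do not exist"
```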
I tried this, but it is not working:
```python
@dlt.expect("valid records count", "count('*') > 7500000")
```
I also tried calling `expect` inside the DLT pipeline function. The function runs successfully, but I cannot find the result in the quality tab or in the event logs:
```python
import dlt

@dlt.table
def bronze_table():
    df = spark.read.table("samples.tpch.orders")
    # Runs without error, but nothing appears in the quality tab or event log.
    dlt.expect("valid records count", df.count() > 7500000)
    return df
```
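One workaround I'm wondering about is materializing the aggregate into a single-row dataset and attaching a row-level expectation to it, so the count check becomes an ordinary per-row expression. This is an untested sketch (`orders_count_check` and `record_count` are names I made up), and I don't know whether it is the intended approach:

```python
import dlt
from pyspark.sql import functions as F

# Single-row table holding the source table's record count;
# the expectation then evaluates exactly once, against that row.
@dlt.table
@dlt.expect("valid_record_count", "record_count > 7500000")
def orders_count_check():
    return (
        spark.read.table("samples.tpch.orders")
        .agg(F.count("*").alias("record_count"))
    )
```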
Can I achieve this with `dlt.expect`?