Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

DLT Expectations Alert for Warning

ajgold
New Contributor II

I want to receive an alert via email or Slack when an @dlt.expect declaration fails its validation check in my DLT pipeline. I only see the option to add an email alert for @dlt.expect_or_fail failures, not for warnings.

6 REPLIES

RiyazAliM
Honored Contributor

Hey @ajgold 
I don't think DLT has this feature yet. You can raise a feature request for Databricks to add it in a future release here: https://databricks.aha.io/
Cheers!

Riz

ajgold
New Contributor II

Hey @RiyazAliM, thanks so much for your response! Do you know of any workarounds for now? For example, a system table that records job runs that resulted in a warning?

szymon_dybczak
Esteemed Contributor III

Hi @ajgold ,

I think I have an idea for a workaround. Lakeflow Declarative Pipelines (formerly DLT) generate an event log that contains all information related to the pipeline, such as:

- audit logs

- data quality checks (this is what you're looking for)

- progress

- data lineage

Lakeflow Declarative Pipelines writes the event log to a hidden Delta table in the default catalog and schema configured for the pipeline. While hidden, the table can still be queried by all sufficiently privileged users. By default, only the owner of the pipeline can query the event log table. 

You can read how to query event_log here:

Monitor Lakeflow Declarative Pipelines - Azure Databricks | Microsoft Learn

And you can read how to query data quality from the event log here:

Monitor Lakeflow Declarative Pipelines - Azure Databricks | Microsoft Learn
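To make the event-log approach concrete, here's a minimal sketch of a query you could run from a notebook to surface expectations that dropped records. Assumptions: the `event_log('<pipeline-id>')` table-valued function is available in your workspace (otherwise substitute the event log table for your pipeline), and the expectation fields (`name`, `dataset`, `passed_records`, `failed_records`) follow the event log schema described in the docs linked above.

```python
# Sketch: build a SQL query that lists expectation warnings (rows dropped
# by @dlt.expect) from a pipeline's event log. The schema string below is
# an assumption based on the documented event log format.

EXPECTATIONS_SCHEMA = (
    "array<struct<name:string, dataset:string, "
    "passed_records:int, failed_records:int>>"
)

def warning_query(pipeline_id: str) -> str:
    """Build a SQL query listing expectations that dropped records."""
    return f"""
        SELECT exp.dataset, exp.name,
               SUM(exp.passed_records) AS passed,
               SUM(exp.failed_records) AS failed
        FROM (
          SELECT explode(from_json(
                   details:flow_progress:data_quality:expectations,
                   '{EXPECTATIONS_SCHEMA}')) AS exp
          FROM event_log('{pipeline_id}')
          WHERE event_type = 'flow_progress'
        )
        GROUP BY exp.dataset, exp.name
        HAVING SUM(exp.failed_records) > 0
    """

# Inside a Databricks notebook you would then run something like:
# df = spark.sql(warning_query("<your-pipeline-id>"))
```

Any row returned by this query corresponds to an expectation that flagged records, i.e. exactly the warnings the email alerting UI doesn't cover.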

 

Now, you can configure custom monitoring for Lakeflow Declarative Pipelines (formerly DLT) with event hooks. This allows you to send notifications to Slack, Teams, etc. Here's the relevant documentation entry to read.

So to sum it up: assuming the event log contains the information you're looking for, you can define an event hook for that event type and send a message to your client of choice.

Define custom monitoring of Lakeflow Declarative Pipelines with event hooks - Azure Databricks | Mic...
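A rough sketch of what such a hook could look like, under these assumptions: the `@dlt.on_event_hook` decorator from the linked docs, an event payload shaped like the event log schema (expectation results under `details.flow_progress.data_quality.expectations`), and a hypothetical Slack incoming-webhook URL you'd supply yourself. The parsing helper is pure Python so it can be tested outside a pipeline.

```python
import json
import urllib.request

SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/..."  # placeholder, set your own

def failed_expectations(event: dict) -> list[dict]:
    """Pull expectations with failed rows out of a flow_progress event, if any."""
    if event.get("event_type") != "flow_progress":
        return []
    details = event.get("details") or {}
    if isinstance(details, str):  # details may arrive as a JSON string
        details = json.loads(details)
    expectations = (details.get("flow_progress", {})
                           .get("data_quality", {})
                           .get("expectations", []))
    return [e for e in expectations if e.get("failed_records", 0) > 0]

def notify_slack(text: str) -> None:
    """Post a plain-text message to a Slack incoming webhook."""
    req = urllib.request.Request(
        SLACK_WEBHOOK_URL,
        data=json.dumps({"text": text}).encode(),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)

try:
    import dlt  # only available inside a Lakeflow/DLT pipeline

    @dlt.on_event_hook
    def alert_on_expectation_warnings(event):
        # Fires on every pipeline event; we only act on expectation warnings.
        for exp in failed_expectations(event):
            notify_slack(
                f"DLT warning: expectation '{exp['name']}' on dataset "
                f"'{exp['dataset']}' flagged {exp['failed_records']} records."
            )
except ImportError:
    pass  # running outside a pipeline; the helpers above remain importable
```

The key design point is that the hook sees *all* events, so filtering on `event_type` and on `failed_records > 0` is what turns it into a warning-only alert, which is exactly the gap @ajgold described.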

 

RiyazAliM
Honored Contributor

This is a brilliant idea, @szymon_dybczak! One small correction: event logs are now persisted to Unity Catalog and can be queried by users with grants on that table. Thank you!

Riz

szymon_dybczak
Esteemed Contributor III

Oh, thanks @RiyazAliM for sharing that. Every day I learn something new here 😄

You're awesome, thank you!! Checking it out today.