Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

How to add a webhook notification in a DLT pipeline through yml

guptaharsh
New Contributor III

Hi team,

I am trying to create a Slack webhook notification for a DLT pipeline using jobs.yml.

targets:
  dev:
    # The default target uses 'mode: development' to create a development copy.
    # - Deployed resources get prefixed with '[dev my_user_name]'
    # - Any job schedules and triggers are paused by default.
    mode: development
    default: true
    workspace:
    variables:
      catalog: data
      schema: sap_data
      notifications: email@jkmail.com
      webhook: https://hooks.slack.com/services6CdQmlT

I save this yml and deploy, but the Slack webhook notification is not working.
#DLT #pipeline

5 REPLIES

szymon_dybczak
Esteemed Contributor III

Hi @guptaharsh ,

Out of the box, there's currently no way to configure notifications via webhook. Only email is supported as a notification medium for pipelines. You can check the REST API docs yourself:

Create a pipeline | Pipelines API | REST API reference | Databricks on AWS


But you can use custom monitoring to achieve what you want. Here's an example from the documentation where they send events to a Slack channel using a webhook:

Define custom monitoring of Lakeflow Declarative Pipelines with event hooks | Databricks on AWS
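For reference, the event hook from that page looks roughly like the sketch below. This is a minimal sketch only: the hook name, the hard-coded webhook URL, and the error-level filter are placeholder assumptions, and the exact event fields are described on the linked page.

import dlt
import requests

# Assumption: in practice the Slack URL would come from a secret or a pipeline
# configuration value; the literal below is only a placeholder.
SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/..."

@dlt.on_event_hook(max_allowable_consecutive_failures=None)
def notify_slack(event):
    # Event hooks are invoked for every entry written to the pipeline event log;
    # each entry is passed in as a dict. Forward error-level events to Slack.
    if event.get("level") == "ERROR":
        requests.post(
            SLACK_WEBHOOK_URL,
            json={"text": f"DLT pipeline event: {event.get('message')}"},
        )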

 


 

guptaharsh
New Contributor III

@szymon_dybczak , thanks for the update. I thought we could add Slack notifications the same way as in jobs.

szymon_dybczak
Esteemed Contributor III

No problem, @guptaharsh 🙂 And if the answer was helpful to you, consider marking it as a solution to this thread.

guptaharsh
New Contributor III

@szymon_dybczak , I went through the Databricks docs and added the webhook notifications to the job that triggers the DLT pipeline (the Jobs API supports webhook notifications, even though the pipeline itself only supports email).
It works fine for me 😊.

 

# The job that triggers api_data_pipeline.
resources:
  jobs:
    api_data_job:
      name: api_data_job
      schedule:
        quartz_cron_expression: 0 30 5 * * ?
        timezone_id: Asia/Kolkata
        pause_status: UNPAUSED

      webhook_notifications:
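        # Assumption: each id below should resolve to the ID of a notification
        # destination configured in the workspace, not the raw Slack webhook URL.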
        on_failure:
          - id: ${var.webhook}
        on_success:
          - id: ${var.webhook}

 

szymon_dybczak
Esteemed Contributor III

That's awesome @guptaharsh , thanks for sharing!
