Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

How to send automated emails from Databricks notebooks based on conditions or events?

Akshay_Petkar
Valued Contributor

Hi everyone,

I’m currently exploring how to replicate something similar to Alteryx Email Activity within Databricks.

Basically, I want to send automated emails to specific users when certain conditions or events occur in a notebook workflow, for example:

  • If a data validation check fails

  • If a specific event or threshold is triggered

  • Or based on an output result from a query or job

Is there any native or recommended approach in Databricks to achieve this?
I’m thinking of something like:

“If this event happens → send an email to that person automatically.”

Has anyone implemented this kind of conditional email notification system in Databricks?
Would appreciate any suggestions or best practices for setting this up.

Thanks,

Akshay Petkar

HariSankar
Contributor III

Hey @Akshay_Petkar,

This is something a lot of people try to do when they move workflows from Alteryx or SSIS into Databricks. There isn’t a direct “Email Activity” node like in Alteryx, but you can
definitely set up automated email notifications in a Databricks notebook based on any condition or event.

Here’s how you can approach it.


1) Sending Emails Directly from a Notebook (Simple and Common Approach)

If you just want to trigger an email when a data validation fails or a threshold is crossed, you can do it right inside your notebook using Python’s built-in smtplib.

You’d typically:

--> Store your email credentials in Databricks Secrets (never hardcode passwords).
--> Use a simple helper function that sends an email whenever your condition is met.
--> This approach is straightforward and works well for small-scale alerts.


Example:

import smtplib
from email.mime.text import MIMEText
from pyspark.sql import functions as F

def send_email(subject, body, recipients):
    sender = "yourmail@abc.com"
    # Never hardcode credentials -- pull them from Databricks Secrets
    password = dbutils.secrets.get("email_scope", "email_password")

    msg = MIMEText(body)
    msg["Subject"] = subject
    msg["From"] = sender
    msg["To"] = ", ".join(recipients)

    with smtplib.SMTP("smtp.office365.com", 587) as server:
        server.starttls()
        server.login(sender, password)
        server.sendmail(sender, recipients, msg.as_string())

# Example: send an alert if data validation fails
df = spark.read.table("sales_data")
invalid_count = df.filter(F.col("amount").isNull()).count()

if invalid_count > 0:
    send_email(
        "Data Validation Failed",
        f"{invalid_count} records have null values in the 'amount' column.",
        ["datateam@company.com"],
    )
----------------------------------------------------------------------------------------------------------------------------------------------
2) Using Databricks Job Notifications

If you just need notifications when a job starts, succeeds, or fails, you can set that up directly from the Job UI under “Notifications”.

That’s the quickest option, but it only covers job-level outcomes, not conditional alerts inside your notebook.
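
For reference, the same job-level notifications can also be configured programmatically through the Databricks Jobs API: the job settings payload accepts an email_notifications block along these lines (the addresses here are placeholders):

```json
{
  "email_notifications": {
    "on_success": ["datateam@company.com"],
    "on_failure": ["datateam@company.com", "oncall@company.com"],
    "no_alert_for_skipped_runs": true
  }
}
```

This is handy if you manage jobs as code (e.g., via Terraform or CI/CD) rather than clicking through the UI.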

----------------------------------------------------------------------------------------------------------------------------------------------
3) Using Logic Apps, Power Automate, or AWS SNS for Scalable Alerting

For production use cases where you want to centralize alerting, you can integrate Databricks with something like Azure Logic Apps or Power Automate.

For example:

Your notebook writes a small record to a Delta table or calls a webhook whenever a certain condition happens.

The Logic App picks it up and sends an email or Teams message automatically.

This keeps your notebooks clean and avoids managing email credentials in Databricks.
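
To make the notebook side of that handoff concrete, here's a minimal sketch of the webhook call, assuming a hypothetical Logic Apps HTTP-trigger URL (in practice you'd store the URL in Databricks Secrets rather than hardcoding it):

```python
import json
import urllib.request

# Hypothetical Logic Apps HTTP-trigger URL -- replace with your own,
# ideally read from Databricks Secrets instead of hardcoding it.
WEBHOOK_URL = "https://example-logic-app.azure.com/workflows/trigger"

def build_alert_payload(check_name, details, severity="warning"):
    # The Logic App parses this JSON body to build the email or Teams message.
    return {"check": check_name, "details": details, "severity": severity}

def post_alert(payload, url=WEBHOOK_URL):
    # POST the alert as JSON to the Logic App's HTTP trigger.
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

# Example: describe a failed validation check (no request is sent here)
payload = build_alert_payload("null_amount_check", "42 null rows in sales_data")
```

On the Logic App side, the HTTP trigger's "Parse JSON" step would map these fields into the email subject and body.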
---------------------------------------------------------------------------------------------------------------------------------------------------

There is no native “if this event → send email” node like in Alteryx, but with a few lines of Python and Databricks Secrets, you can easily build your own lightweight version of that behavior.
If you need something more scalable, hook it into Logic Apps and let that service handle the notifications for you.

Hope this helps

harisankar