Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Email notification to end users

eballinger
Contributor

Is there a way we can notify all of our Databricks end users by email when there is an issue? We currently have our jobs set up to notify the technical team when a job workflow fails. That part works fine.

But we would like the ability to use a Python notebook and API, perhaps, to send a custom message to all of our end users when there are issues. For example, sometimes there are issues downstream and our jobs are not run, so in these cases we would like to be able to run something that sends an email to "All Databricks users" with a custom message.

I know we can maintain a user list to notify and send this manually, like we do in other systems. But over time users come and go, so being able to send a simple message to current active users would be best.

If this is not possible, it would be a good feature for the future.

Thanks

5 REPLIES

lingareddy_Alva
Honored Contributor II

Hi @eballinger 

You're looking for a way to notify all active Databricks users via email, which is a common operational need. While Databricks doesn't have a built-in "notify all users" feature, you can achieve this by using the Databricks API to get the active users and then sending the emails yourself.

Key Features of This Solution
1. Automatic User Discovery
-- Uses Databricks SCIM API to fetch all workspace users
-- Filters for active users only
-- No need to maintain separate user lists

2. Flexible Notification Options
-- Send to all users or filter by recent activity
-- Support for both text and HTML emails
-- Personalized messages with user names

3. Built-in Safety Features
-- Test mode to preview recipients before sending
-- Batch email sending to avoid SMTP limits
-- Error handling and logging

Setup Requirements
1. Databricks Token
-- Create a personal access token with appropriate permissions

2. Email Configuration
-- Set up SMTP credentials

3. Required Permissions
Your Databricks token needs:
-- User management permissions
-- SCIM API access (usually requires admin role)
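A minimal sketch of such a notebook, assuming placeholder values for the workspace URL, token, SMTP host, and sender address (batch size, primary-email handling, and the test mode are illustrative choices, not a definitive implementation):

```python
import json
import smtplib
import urllib.request
from email.mime.text import MIMEText


def fetch_workspace_users(host: str, token: str) -> list[dict]:
    """Fetch workspace users from the Databricks SCIM API."""
    req = urllib.request.Request(
        f"{host}/api/2.0/preview/scim/v2/Users",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp).get("Resources", [])


def active_emails(users: list[dict]) -> list[str]:
    """Keep only active users and pull out one email address each."""
    emails = []
    for user in users:
        if not user.get("active"):
            continue
        for email in user.get("emails", []):
            if email.get("primary", True):  # prefer the primary address
                emails.append(email["value"])
                break
    return emails


def batched(items: list[str], size: int) -> list[list[str]]:
    """Split recipients into batches to stay under SMTP recipient limits."""
    return [items[i:i + size] for i in range(0, len(items), size)]


def notify(emails, subject, body, smtp_host, sender, test_mode=True):
    """Send the message in batches; in test mode just preview recipients."""
    for batch in batched(emails, 50):
        if test_mode:
            print(f"[test] would send '{subject}' to: {batch}")
            continue
        msg = MIMEText(body)
        msg["Subject"] = subject
        msg["From"] = sender
        msg["To"] = sender  # recipients go on the envelope, like a BCC
        with smtplib.SMTP(smtp_host) as smtp:
            smtp.sendmail(sender, batch, msg.as_string())
```

Running `notify(...)` with `test_mode=True` first lets you verify the recipient list before any real email goes out.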


Alternative Approaches:
1. Slack Integration (if your org uses Slack)
2. Microsoft Teams Integration

This solution gives you exactly what you need: the ability to automatically notify all current active Databricks users without manually maintaining user lists. The system automatically discovers users and handles the complexity of email delivery while providing safety features for testing and monitoring.

 

LR

Isi
Contributor III

Hey @eballinger ,

 

Step 1: Add a Notification Task to Your Workflow

The first thing you should do is add an extra task to your Databricks job / Airflow DAG / etc., and set its dependency to "At least one failed".

This way, if any upstream task fails, the notification task will automatically be triggered and can handle alerting users.
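In Databricks Workflows, that dependency is the task-level `run_if` setting in the Jobs API. A minimal fragment (task keys and notebook paths here are placeholders):

```json
{
  "tasks": [
    {
      "task_key": "etl_step",
      "notebook_task": {"notebook_path": "/Jobs/etl"}
    },
    {
      "task_key": "notify_users",
      "depends_on": [{"task_key": "etl_step"}],
      "run_if": "AT_LEAST_ONE_FAILED",
      "notebook_task": {"notebook_path": "/Jobs/notify_users"}
    }
  ]
}
```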

 

Step 2: Get the List of Active Users

Databricks exposes a SCIM API endpoint you can use to retrieve all users in your workspace:

GET /api/2.0/preview/scim/v2/Users

From the API response, simply filter for users where "active": true and extract their email addresses; this ensures you're always notifying the current active users.
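A short sketch of that step, assuming you supply the host and token. The SCIM list response is paged (`startIndex`, `itemsPerPage`, `totalResults`), so loop until all pages are read:

```python
import json
import urllib.parse
import urllib.request


def emails_from_page(page: dict) -> list[str]:
    """Extract email addresses of active users from one SCIM response page."""
    return [
        e["value"]
        for user in page.get("Resources", [])
        if user.get("active")
        for e in user.get("emails", [])
    ]


def list_active_user_emails(host: str, token: str, page_size: int = 100) -> list[str]:
    """Walk every page of /Users and collect active users' addresses."""
    emails, start = [], 1
    while True:
        query = urllib.parse.urlencode({"startIndex": start, "count": page_size})
        req = urllib.request.Request(
            f"{host}/api/2.0/preview/scim/v2/Users?{query}",
            headers={"Authorization": f"Bearer {token}"},
        )
        with urllib.request.urlopen(req) as resp:
            page = json.load(resp)
        emails.extend(emails_from_page(page))
        start += page.get("itemsPerPage", page_size)
        if start > page.get("totalResults", 0):
            return emails
```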

 

Step 3: Send Emails the Right Way

If you send emails directly from the notebook using something like smtplib, or via an SMTP server without proper domain authentication, your emails are likely to end up in spam or even get blocked entirely.

That’s why I strongly recommend:

Authenticating from your notebook against Microsoft Graph (or your corporate email provider) and sending the emails from there.

This ensures:

  • Emails come from a trusted, legitimate corporate identity.

  • SPF/DKIM/DMARC policies are respected.

  • No reliance on insecure third-party servers.
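A hedged sketch of that approach using the standard Azure AD client-credentials flow and the Graph `sendMail` endpoint. The tenant ID, client ID, secret, and sender mailbox are placeholders you would register in your own Entra ID app:

```python
import json
import urllib.parse
import urllib.request


def graph_token(tenant_id: str, client_id: str, client_secret: str) -> str:
    """Acquire a client-credentials token for Microsoft Graph."""
    data = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": "https://graph.microsoft.com/.default",
    }).encode()
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    with urllib.request.urlopen(urllib.request.Request(url, data=data)) as resp:
        return json.load(resp)["access_token"]


def build_message(subject: str, body: str, recipients: list[str]) -> dict:
    """Graph sendMail payload (text body; use contentType 'HTML' for HTML)."""
    return {
        "message": {
            "subject": subject,
            "body": {"contentType": "Text", "content": body},
            "toRecipients": [
                {"emailAddress": {"address": addr}} for addr in recipients
            ],
        },
        "saveToSentItems": False,
    }


def send_mail(token: str, sender: str, payload: dict) -> None:
    """POST the message via Graph on behalf of the sender mailbox."""
    req = urllib.request.Request(
        f"https://graph.microsoft.com/v1.0/users/{sender}/sendMail",
        data=json.dumps(payload).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)  # Graph returns 202 Accepted on success
```

The app registration needs the Graph `Mail.Send` application permission granted by an admin for this flow to work.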

 

Summary of the Flow:

  1. Job fails → notification task is triggered.

  2. Notebook calls the SCIM API to get active users.

  3. Authenticates to Microsoft Graph with proper permissions.

  4. Builds a custom email (text or HTML).

  5. Sends it to all current users in the workspace.

Hope this helps 🙂

Isi

ciro
New Contributor II

Having trouble ensuring consistent email alerts reach end-users on time. The current setup feels limited, especially when notifying people outside Databricks. Would appreciate any tips or workarounds that have worked for others.

Isi
Contributor III

@ciro 

Could you explain your situation in more detail? Perhaps you could open a separate question describing it, and then we can discuss how to resolve it.

Best Regards, 

Isi

eballinger
Contributor

Thanks LRALVA & Isi,

I like both of your suggestions. I did look into making my own notebook using smtplib, but stopped because I do not know of any open SMTP server or cloud email service in our Azure environment. This is why I was hoping to leverage something within Databricks natively: I can already have a task email me when a workflow fails, so I would just like to extend that built-in functionality and use whatever email service it is using. Is that not possible?

Thanks again for the great suggestions. If I had more control in my environment I would certainly look into those suggestions more.  
