03-29-2023 02:53 AM
We want to use an existing Databricks SMTP server, or a Databricks API if one can be used, to send custom emails. Databricks Workflows sends email notifications on job success, failure, etc., but it cannot send custom emails. We want custom emails so recipients can tell which Databricks environment a notification comes from: Dev, QA, Stage, or Prod.
What other possible ways are there to send emails? Please advise.
I have used the Python smtplib package to send a message but got the error ConnectionRefusedError: [Errno 111] Connection refused at smtpObj = smtplib.SMTP('localhost'):
import smtplib
from smtplib import SMTPException

sender = 'abc@gcd.com'
receivers = ['myname@company.com']

# RFC 822 message: headers, then a blank line, then the body
message = """From: From Person <abc@gcd.com>
To: To Person <myname@company.com>
Subject: SMTP e-mail test

This is a test e-mail message.
"""

try:
    smtpObj = smtplib.SMTP('localhost')
    smtpObj.sendmail(sender, receivers, message)
    print("Successfully sent email")
except SMTPException:
    print("Error: unable to send email")
03-29-2023 08:11 AM
@Krishna Prasad Depending on which emails you need and how you want them sent, you could check out Databricks SQL Alerts, which can send alerts driven by custom logic over a Delta table. You could use Jobs to combine the notebooks with SQL alerts, for example raising an alert if a notebook failed.
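As a rough illustration (the table and column names here are hypothetical), the kind of check a notebook step could evaluate over a Delta table so that a failure surfaces through the job's own notification:

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical Delta table tracking run outcomes
failures = spark.sql("""
    SELECT count(*) AS n
    FROM ops.job_runs
    WHERE status = 'FAILED' AND run_date = current_date()
""").first()["n"]

if failures > 0:
    # Failing the notebook triggers the job's failure notification email
    raise RuntimeError(f"{failures} failed runs today")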
03-29-2023 08:51 PM
Hi @Krishna Prasad
Thank you for your question! To assist you better, please take a moment to review the answer and let me know if it best fits your needs.
Please help us select the best solution by clicking on "Select As Best" if it does.
Your feedback will help us ensure that we are providing the best possible service to you.
Thank you!
10-22-2024 11:14 PM
Currently, the Alert service only allows sending to a fixed email list. Can we send dynamically based on a column in the dataset? For example, I have a dataset that includes an email column; can I send an email per row, driven by that column?
4 weeks ago
Hello. Did you figure this out? We have a similar problem and this approach could potentially be a nice solution.
3 weeks ago - last edited 3 weeks ago
So you could do this via the API, combined with alerts. The challenge is obtaining the query_ids needed to create the notifications, plus the notification destinations themselves. It would go something like this:
- Save your queries as actual "queries" in Databricks
- Create a table storing your metadata: the query display name, the email recipients, a column for the trigger column, and a column for the query_id
- Use the notification-destinations API to iterate through your table and create a destination per email address in your table
- Use the queries API to iterate through the queries you own, matching display names to fill in each query_id (rough sketch below)
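Not tested end to end, but here is a minimal sketch of those two calls with the Python requests library, assuming the public notification-destinations and SQL queries REST endpoints; the host, token, and function names are placeholders:

import requests

HOST = "https://<workspace-url>"                               # placeholder workspace URL
HEADERS = {"Authorization": "Bearer <personal-access-token>"}  # placeholder token

def create_email_destination(email):
    """Create one email notification destination and return its id."""
    resp = requests.post(
        f"{HOST}/api/2.0/notification-destinations",
        headers=HEADERS,
        json={"display_name": email,
              "config": {"email": {"addresses": [email]}}},
    )
    resp.raise_for_status()
    return resp.json()["id"]

def find_query_id(display_name):
    """Scan saved queries for one whose display name matches."""
    resp = requests.get(f"{HOST}/api/2.0/sql/queries", headers=HEADERS)
    resp.raise_for_status()
    for q in resp.json().get("results", []):
        if q.get("display_name") == display_name:
            return q["id"]
    return None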
Wednesday
Hey - I ended up writing a blog on this; hopefully it's a simpler way of doing things than I first thought 🙂
https://sqlofthenorth.blog/2025/03/04/data-driven-notifications-with-databricks/
Wednesday
Hello @mido1978
I am also in need of something similar.
I have two tables: one has the recipient details and the other has log data, linked by a common key. Emails need to be sent to each applicable recipient containing the log entries that apply to them.
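Roughly, the shape is something like this (a hypothetical PySpark sketch with made-up table and column names, not my actual schema):

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Hypothetical tables, just to show the shape of the join
recipients = spark.table("ops.recipients")   # columns: key, email
logs = spark.table("ops.error_logs")         # columns: key, log_line

# One row per recipient, carrying all of that recipient's log lines
per_recipient = (
    logs.join(recipients, on="key")
        .groupBy("email")
        .agg(F.collect_list("log_line").alias("lines"))
)

def send_email(to, subject, body):
    # Stub: plug in whichever sending mechanism you settle on (relay, API, ...)
    print(f"would send to {to}: {subject} ({len(body)} chars)")

for row in per_recipient.collect():
    send_email(row["email"], "Your log entries", "\n".join(row["lines"]))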
Please share the POC/code when it's ready 🙂.
Wednesday
Hey @pk13, see my reply from this morning - hope it's of use to you!