Sending email attachment from Databricks to Google Drive

AryaMa
New Contributor III

https://stackoverflow.com/questions/67088891/send-email-from-databricks-notebook-with-attachment

I have to send the attachment directly to the organisation's Google Drive folder instead of emailing it. Any suggestions?

Sample email-with-attachment code:

# Assumes send_from, send_to, subject, message, files, server, port,
# use_tls, username and password are defined earlier in the notebook.
import smtplib
from email import encoders
from email.mime.base import MIMEBase
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText
from pathlib import Path

msg = MIMEMultipart()
msg['From'] = send_from
msg['To'] = send_to if isinstance(send_to, str) else ', '.join(send_to)
msg['Subject'] = subject
msg.attach(MIMEText(message))

# Attach each file as a base64-encoded octet-stream part
for path in files:
    part = MIMEBase('application', 'octet-stream')
    with open(path, 'rb') as file:
        part.set_payload(file.read())
    encoders.encode_base64(part)
    part.add_header('Content-Disposition',
                    'attachment; filename="{}"'.format(Path(path).name))
    msg.attach(part)

# Send the message over SMTP, upgrading to TLS if requested
smtp = smtplib.SMTP(server, port)
if use_tls:
    smtp.starttls()
smtp.login(username, password)
smtp.sendmail(send_from, send_to, msg.as_string())
smtp.quit()

3 REPLIES

Hubert-Dudek
Esteemed Contributor III

Maybe just use Azure Logic Apps or Power Automate (trigger it as an HTTP request with JSON, then do all the actions there).
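
For illustration, a minimal sketch of the Databricks side of such a call, assuming the flow's HTTP trigger URL is stored in logic_app_url and the Logic App expects the file as base64-encoded JSON (the file path and field names here are placeholders, not part of any real flow):

import base64
import requests

# Read the report from DBFS and wrap it in the JSON shape the flow expects
with open('/dbfs/tmp/report.csv', 'rb') as f:
    payload = {
        'filename': 'report.csv',
        'content_base64': base64.b64encode(f.read()).decode('ascii'),
    }

resp = requests.post(logic_app_url, json=payload, timeout=60)
resp.raise_for_status()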

AryaMa
New Contributor III

We are not using Azure. We read data from S3 and have AWS infrastructure in the backend; all Spark jobs run on EMR through Lambda and load data to S3. Right now we are using Databricks for reconciliation of the data.

Hubert-Dudek
Esteemed Contributor III

Yes, I understand that you use AWS, but Google Drive is also an external service. Making a POST request from AWS to Logic Apps in Azure, which then saves the file to Google Drive, is a really easy approach: efficient, quick to deploy, and easy to monitor.

Another option is to use Lambda in AWS, so your script from Databricks still calls HTTP (just a Lambda endpoint instead of Logic Apps), and the Lambda script can then use the Google Drive API: https://developers.google.com/drive/api/v3/reference/files/create.
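
A minimal sketch of what that Lambda handler could look like, assuming it is invoked through an HTTP endpoint (API Gateway proxy format), a service-account JSON key that has been shared on the target Drive folder, and the google-api-python-client / google-auth packages bundled with the function. The folder ID, file names and paths are placeholders:

import base64
import json
from google.oauth2 import service_account
from googleapiclient.discovery import build
from googleapiclient.http import MediaFileUpload

SCOPES = ['https://www.googleapis.com/auth/drive.file']

def lambda_handler(event, context):
    # Decode the base64 payload sent from Databricks and stage it in /tmp
    body = json.loads(event['body'])
    local_path = '/tmp/' + body['filename']
    with open(local_path, 'wb') as f:
        f.write(base64.b64decode(body['content_base64']))

    # Authenticate as a service account with write access to the folder
    creds = service_account.Credentials.from_service_account_file(
        'service_account.json', scopes=SCOPES)
    drive = build('drive', 'v3', credentials=creds)

    # files.create with a multipart upload: metadata plus file contents
    metadata = {'name': body['filename'],
                'parents': ['<drive-folder-id>']}  # target folder ID
    media = MediaFileUpload(local_path, mimetype='application/octet-stream')
    created = drive.files().create(
        body=metadata, media_body=media, fields='id').execute()
    return {'statusCode': 200,
            'body': json.dumps({'file_id': created.get('id')})}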

There are many other third-party and other solutions as well; these two are just the ones I have used.
