Send custom logs to AWS CloudWatch from Notebook

Murthy1
Contributor II

I would like to send some custom logs (in Python) from my Databricks notebook to AWS CloudWatch.

For example:

df = spark.read.json(".......................")

logger.info("Successfully ingested data from json")

Has someone succeeded in doing this before? Your help is much appreciated. Thanks in advance!

5 REPLIES

Debayan
Esteemed Contributor III

Hi, you can integrate them; please refer to https://aws.amazon.com/blogs/mt/how-to-monitor-databricks-with-amazon-cloudwatch/. You can also configure audit logging to S3 and redirect it from S3 to CloudWatch on the AWS side.

Hello @Debayan Mukherjee,

Thanks for your reply; it was helpful.

To redirect log files from S3 to CloudWatch, I can use boto3 (for example: https://stackoverflow.com/questions/59147344/sending-emr-logs-to-cloudwatch).
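If boto3 is an option, you can also skip the S3 hop entirely and ship log records straight from the notebook with a custom `logging.Handler` built on the CloudWatch Logs API (`create_log_group`, `create_log_stream`, `put_log_events`). A minimal sketch follows; the `LOG_GROUP` and `LOG_STREAM` names are placeholders, and it assumes the cluster has AWS credentials (e.g. an instance profile) with CloudWatch Logs permissions:

```python
import logging
import time

# Hypothetical names -- replace with your own log group and stream.
LOG_GROUP = "/databricks/notebook-logs"
LOG_STREAM = "ingestion"


class CloudWatchHandler(logging.Handler):
    """Minimal logging.Handler that forwards each record to CloudWatch Logs."""

    def __init__(self, log_group, log_stream):
        super().__init__()
        # boto3 is preinstalled on Databricks runtimes; imported lazily so the
        # class can still be defined in environments without boto3.
        import boto3
        self.client = boto3.client("logs")
        self.log_group = log_group
        self.log_stream = log_stream
        # Create the group and stream if they do not exist yet.
        try:
            self.client.create_log_group(logGroupName=log_group)
        except self.client.exceptions.ResourceAlreadyExistsException:
            pass
        try:
            self.client.create_log_stream(
                logGroupName=log_group, logStreamName=log_stream
            )
        except self.client.exceptions.ResourceAlreadyExistsException:
            pass

    def emit(self, record):
        # CloudWatch expects timestamps in milliseconds since the epoch.
        self.client.put_log_events(
            logGroupName=self.log_group,
            logStreamName=self.log_stream,
            logEvents=[{
                "timestamp": int(time.time() * 1000),
                "message": self.format(record),
            }],
        )


# Usage in the notebook (runs real AWS calls, so credentials are required):
# logger = logging.getLogger("notebook")
# logger.setLevel(logging.INFO)
# logger.addHandler(CloudWatchHandler(LOG_GROUP, LOG_STREAM))
# logger.info("Successfully ingested data from json")
```

Note that `put_log_events` is called once per record here for simplicity; for chatty jobs you would batch records before flushing to stay within CloudWatch API limits.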

However, is there any native integration with AWS CloudWatch?

Debayan
Esteemed Contributor III

Hi, the only native integration we have is: https://aws.amazon.com/blogs/mt/how-to-monitor-databricks-with-amazon-cloudwatch/.

Thank you!

Thanks for the Response!
