02-08-2023 07:12 AM
I would like to send some custom logs (in Python) from my Databricks notebook to AWS Cloudwatch.
For example:
df = spark.read.json(".......................")
logger.info("Successfully ingested data from json")
Has someone succeeded in doing this before? Your help is much appreciated. Thanks in advance!
02-09-2023 09:32 PM
Hi, you can integrate them; please refer to https://aws.amazon.com/blogs/mt/how-to-monitor-databricks-with-amazon-cloudwatch/. You can also configure audit logging to S3 and redirect it to CloudWatch from the AWS side.
02-10-2023 01:52 AM
Hello @Debayan Mukherjee,
Thanks for your reply; it was helpful.
To redirect log files from S3 to CloudWatch, I can use boto3 (for example: https://stackoverflow.com/questions/59147344/sending-emr-logs-to-cloudwatch).
However, is there any native integration with AWS CloudWatch?
02-12-2023 09:39 PM
Hi, the only native integration we have is this one: https://aws.amazon.com/blogs/mt/how-to-monitor-databricks-with-amazon-cloudwatch/.
Thank you!
02-13-2023 07:05 AM
Thanks for the response!