02-08-2023 07:12 AM
I would like to send some custom logs (in Python) from my Databricks notebook to AWS Cloudwatch.
For example:
df = spark.read.json(".......................")
logger.info("Successfully ingested data from json")
Has someone succeeded in doing this before? Your help is much appreciated. Thanks in advance!
02-09-2023 09:32 PM
Hi, you can integrate Databricks with CloudWatch; please refer to: https://aws.amazon.com/blogs/mt/how-to-monitor-databricks-with-amazon-cloudwatch/. You can also configure audit logging to S3 and forward it to CloudWatch from AWS.
02-10-2023 01:52 AM
Hello @Debayan Mukherjee ,
Thanks for your reply; it was helpful.
To forward log files from S3 to CloudWatch, I can use boto3 (for example: https://stackoverflow.com/questions/59147344/sending-emr-logs-to-cloudwatch).
However, is there any native integration with AWS CloudWatch?
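For reference, sending custom log lines from Python can also be done directly with a custom logging handler that calls the CloudWatch Logs `put_log_events` API. Below is a minimal sketch; the log group/stream names are hypothetical, and in a real notebook you would pass `client = boto3.client("logs")` (with the group and stream created beforehand via `create_log_group` / `create_log_stream`). Here the wiring is shown with a placeholder client so the snippet stands alone.

```python
import logging

class CloudWatchHandler(logging.Handler):
    """Forward log records to CloudWatch Logs via put_log_events.

    `client` should be a boto3 CloudWatch Logs client in practice
    (boto3.client("logs")); any object exposing put_log_events works.
    """

    def __init__(self, client, log_group, log_stream):
        super().__init__()
        self.client = client
        self.log_group = log_group
        self.log_stream = log_stream

    def emit(self, record):
        # One event per record; for high volume you would batch these.
        self.client.put_log_events(
            logGroupName=self.log_group,
            logStreamName=self.log_stream,
            logEvents=[{
                "timestamp": int(record.created * 1000),  # epoch millis
                "message": self.format(record),
            }],
        )

# Placeholder client so this sketch runs without AWS credentials;
# replace with boto3.client("logs") in a real notebook.
class _StubClient:
    def __init__(self):
        self.events = []
    def put_log_events(self, **kwargs):
        self.events.extend(kwargs["logEvents"])

client = _StubClient()
logger = logging.getLogger("ingestion")
logger.setLevel(logging.INFO)
logger.addHandler(CloudWatchHandler(client, "/databricks/notebooks", "my-stream"))
logger.info("Successfully ingested data from json")
```

With a real boto3 client, the `logger.info(...)` calls from the notebook would then appear as events in the named log stream.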
02-12-2023 09:39 PM
Hi, the only native integration we have is: https://aws.amazon.com/blogs/mt/how-to-monitor-databricks-with-amazon-cloudwatch/.
Thank you!
02-13-2023 07:05 AM
Thanks for the response!