- 13197 Views
- 3 replies
- 6 kudos
I want to add custom log messages that are redirected to the Spark driver logs. Can I use the existing logger classes to surface my application logs or progress messages in the Spark driver logs?
Latest Reply
1) Is it possible to save all the custom logging to its own file? Currently it is being logged together with all the other cluster logs (see image). 2) Also, it seems like Databricks is creating a lot of blank files for this. Is this a bug? This include...
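A minimal sketch of both points above, using only Python's standard `logging` module: messages sent to stderr generally end up in the driver's log on Spark clusters, and attaching a `FileHandler` gives the custom logs their own file. The logger name `my_app` and the file path are illustrative assumptions, not anything Databricks-specific.

```python
import logging

# Hypothetical logger name; pick one per application.
logger = logging.getLogger("my_app")
logger.setLevel(logging.INFO)
formatter = logging.Formatter("%(asctime)s %(name)s %(levelname)s %(message)s")

# Stream handler: messages written to stderr typically show up in the
# driver's standard-error log.
stream = logging.StreamHandler()
stream.setFormatter(formatter)
logger.addHandler(stream)

# Dedicated file handler, so custom logs are kept separate from the
# other cluster logs (the path is illustrative).
file_handler = logging.FileHandler("/tmp/my_app.log")
file_handler.setFormatter(formatter)
logger.addHandler(file_handler)

logger.info("Successfully ingested data from json")
```

To ship the file off the cluster, the cluster's log delivery location could point at the same directory, though that part depends on your workspace configuration.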
- 6875 Views
- 5 replies
- 4 kudos
I would like to send some custom logs (in Python) from my Databricks notebook to AWS CloudWatch. For example:
df = spark.read.json(".......................")
logger.info("Successfully ingested data from json")
Has someone succeeded in doing this before...
Latest Reply
Hi, you can integrate the two; please refer to https://aws.amazon.com/blogs/mt/how-to-monitor-databricks-with-amazon-cloudwatch/. You can also configure audit logging to S3 and redirect it to CloudWatch from AWS; refer to https://aws.amazon.com/blogs/mt/how...
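Beyond the agent-based setup in the linked post, a direct sketch of pushing notebook log messages to CloudWatch Logs via `boto3`. The log group and stream names are assumptions, and the code assumes AWS credentials and `boto3` are available on the cluster; `boto3` is imported lazily so the event-building part runs anywhere.

```python
import time

# Hypothetical names; create the group/stream beforehand or handle
# ResourceNotFoundException in real use.
LOG_GROUP = "/databricks/notebook-logs"   # assumption
LOG_STREAM = "ingestion"                  # assumption

def make_event(message):
    """CloudWatch Logs expects millisecond epoch timestamps."""
    return {"timestamp": int(time.time() * 1000), "message": message}

def send_to_cloudwatch(messages):
    # Lazy import: only needed when actually talking to AWS.
    import boto3
    client = boto3.client("logs")
    client.put_log_events(
        logGroupName=LOG_GROUP,
        logStreamName=LOG_STREAM,
        logEvents=[make_event(m) for m in messages],
    )

event = make_event("Successfully ingested data from json")
```

For production use, a custom `logging.Handler` that batches events before calling `put_log_events` would avoid one API call per message.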
by
vs_29
• New Contributor II
- 2954 Views
- 1 replies
- 3 kudos
Latest Reply
Hi @VIjeet Sharma, do you receive any error? This can be an issue with using the DBFS mount point /dbfs in an init script: the DBFS mount point is mounted asynchronously, so at the very beginning of init script execution that mount point might not be ava...
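A way around the asynchronous mount is to poll for /dbfs before using it. Init scripts are typically shell scripts, but the same wait logic in Python (callable from an init script) looks roughly like this; the timeout values are arbitrary assumptions.

```python
import os
import time

def wait_for_mount(path="/dbfs", timeout_s=60, interval_s=2):
    """Poll until the mount point directory exists, or give up.

    Because the DBFS mount is attached asynchronously, an init script
    that needs /dbfs should wait for it rather than assume it exists.
    """
    deadline = time.time() + timeout_s
    while time.time() < deadline:
        if os.path.isdir(path):
            return True
        time.sleep(interval_s)
    return False
```

If the mount never appears within the timeout, failing the script explicitly gives a clearer error than a missing-file traceback later on.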
- 2145 Views
- 0 replies
- 2 kudos
I would appreciate it if there is a Black Friday deal on the Databricks Data Engineering Associate course, or if I can get a personal coupon.