Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

digui
by New Contributor
  • 5452 Views
  • 3 replies
  • 0 kudos

Issues when trying to modify log4j.properties

Hi y'all. I'm trying to export metrics and logs to AWS CloudWatch, but while following their tutorial to do so, I ended up facing this error when trying to initialize my cluster with an init script they provided. This is the part where the script fail...

Latest Reply
cool_cool_cool
New Contributor II
  • 0 kudos

@digui Did you figure out what to do? We're facing the same issue; the script works for the executors. I was thinking of adding an if that checks whether log4j.properties exists and modifies it only if it does.

2 More Replies
Murthy1
by Contributor II
  • 5992 Views
  • 5 replies
  • 4 kudos

Send custom logs to AWS cloudwatch from Notebook

I would like to send some custom logs (in Python) from my Databricks notebook to AWS CloudWatch. For example:
df = spark.read.json(".......................")
logger.info("Successfully ingested data from json")
Has someone succeeded in doing this before...

Latest Reply
Debayan
Databricks Employee
  • 4 kudos

Hi, you can integrate them; please refer to https://aws.amazon.com/blogs/mt/how-to-monitor-databricks-with-amazon-cloudwatch/. You can also configure audit logging to S3 and redirect it to CloudWatch from AWS; refer to https://aws.amazon.com/blogs/mt/how...

4 More Replies
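
A minimal sketch of one way to do what this thread asks, using boto3's CloudWatch Logs client directly from the notebook. The log group and stream names below are illustrative assumptions (not from the thread), and the cluster's instance profile or other AWS credentials must be allowed to create log groups/streams and put log events:

# Sketch: forward Python logger output from a Databricks notebook to CloudWatch Logs.
import logging
import time

import boto3


class CloudWatchHandler(logging.Handler):
    """Logging handler that forwards each record to a CloudWatch log stream."""

    def __init__(self, log_group, log_stream, region_name="us-east-1"):
        super().__init__()
        self.client = boto3.client("logs", region_name=region_name)
        self.log_group = log_group
        self.log_stream = log_stream
        # Create the group and stream if they do not exist yet.
        try:
            self.client.create_log_group(logGroupName=log_group)
        except self.client.exceptions.ResourceAlreadyExistsException:
            pass
        try:
            self.client.create_log_stream(
                logGroupName=log_group, logStreamName=log_stream
            )
        except self.client.exceptions.ResourceAlreadyExistsException:
            pass

    def emit(self, record):
        # CloudWatch expects timestamps in milliseconds since the epoch.
        self.client.put_log_events(
            logGroupName=self.log_group,
            logStreamName=self.log_stream,
            logEvents=[
                {"timestamp": int(time.time() * 1000), "message": self.format(record)}
            ],
        )


logger = logging.getLogger("notebook")
logger.setLevel(logging.INFO)
# Hypothetical group/stream names, chosen for illustration only.
logger.addHandler(CloudWatchHandler("/databricks/notebook-logs", "ingestion"))

df = spark.read.json(".......................")  # path elided in the original post
logger.info("Successfully ingested data from json")

The AWS blog linked in the reply takes a different route (an init script that configures the CloudWatch agent on each cluster node); the sketch here only covers ad-hoc application-level messages logged from notebook code.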
IdanYaffe
by New Contributor II
  • 1335 Views
  • 1 reply
  • 2 kudos

Cannot find an AWS Cloudwatch init script that supports runtime 11.x

Hi all, I'm using the AWS CW global init script in order to monitor my clusters' instances. I'm also using Delta Live Tables with some Auto Loader jobs. Unfortunately, the Delta Live Tables pipelines are now running runtime version 11. As a result, newly created pipe...

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 2 kudos

Unfortunately, in Delta Live Tables you cannot specify the runtime (except the current and preview channels, which you mentioned). It would be helpful if DLT runtime releases were documented on the Databricks side the same way as the SQL, ML, and standard ones @Kaniz ...
