Databricks Log4J Custom Appender Not Working as expected
08-25-2022 01:01 AM
I'm trying to figure out how a custom appender should be configured in a Databricks environment, but I cannot get it to work.
When the cluster is running, the time column in `driver logs` shows 'unknown' for my custom log file, and when the cluster is stopped, the custom log file does not appear in the log file list at all.
# appender configuration
log4j.appender.bplm=com.databricks.logging.RedactionRollingFileAppender
log4j.appender.bplm.layout=org.apache.log4j.PatternLayout
log4j.appender.bplm.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
log4j.appender.bplm.rollingPolicy=org.apache.log4j.rolling.TimeBasedRollingPolicy
log4j.appender.bplm.rollingPolicy.FileNamePattern=logs/log4j-%d{yyyy-MM-dd-HH}-bplm.log.gz
log4j.appender.bplm.rollingPolicy.ActiveFileName=logs/log4j-bplm.log
log4j.logger.com.myPackage=INFO,bplm
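For context, the last line routes everything logged under `com.myPackage` to this appender; from a Python notebook that would look roughly like the following (the logger name `com.myPackage.MyJob` is only an illustration):

```python
# Rough sketch: emit log records under com.myPackage from a Python notebook so
# they are picked up by the log4j.logger.com.myPackage=INFO,bplm line above.
# The logger name "com.myPackage.MyJob" is a placeholder, not a real class.
log4j = spark._jvm.org.apache.log4j
logger = log4j.LogManager.getLogger("com.myPackage.MyJob")
logger.info("example message that should be routed to the bplm appender")
```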
The above configuration was added to the following files; one way to apply it is via a cluster-scoped init script, as sketched after the list:
- "/databricks/spark/dbconf/log4j/executor/log4j.properties"
- "/databricks/spark/dbconf/log4j/driver/log4j.properties"
- "/databricks/spark/dbconf/log4j/master-worker/log4j.properties"
After this configuration was added to the files above, there are two issues I cannot figure out.
1 - When the cluster is running, if I go to `driver logs`, I can see my custom log file in the list of log files, correctly populated, but the time column is displayed as 'unknown'.
2 - When the cluster is stopped, if I go to `driver logs`, the files from my custom appender are not displayed at all (only stdout, stderr, and log4j-active are shown).
I also tried different FileNamePatterns, but the issues mentioned above seem to happen with every configuration I tried:
log4j.appender.bplm.rollingPolicy.FileNamePattern=logs/log4j-%d{yyyy-MM-dd-HH}.bplm.log.gz - appender1
log4j.appender.bplm.rollingPolicy.FileNamePattern=logs/log4j.bplm-%d{yyyy-MM-dd-HH}.log.gz - appender2
log4j.appender.bplm.rollingPolicy.FileNamePattern=logs/bplm-log4j-%d{yyyy-MM-dd-HH}.log.gz - appender3
log4j.appender.bplm.rollingPolicy.FileNamePattern=logs/bplm.log4j-%d{yyyy-MM-dd-HH}.log.gz - appender4
log4j.appender.bplm.rollingPolicy.FileNamePattern=logs/log4j-%d{yyyy-MM-dd-HH}.log.bplm.gz - appender5
log4j.appender.bplm7.rollingPolicy.FileNamePattern=logs/log4j-bplm-%d{yyyy-MM-dd-HH}.log.gz - appender7
log4j.appender.bplm8.rollingPolicy.FileNamePattern=logs/log4j-%d{yyyy-MM-dd-HH}-bplm.log.gz - appender8
I also tried putting *-active in the ActiveFileName, but the result was the same:
log4j.appender.custom.rollingPolicy.FileNamePattern=/tmp/custom/logs/log4j-bplm-%d{yyyy-MM-dd-HH}.log.gz
log4j.appender.custom.rollingPolicy.ActiveFileName=/tmp/custom/logs/log4j-bplm-active.log
09-19-2022 10:25 PM
@Costi Chiulan What is the DBR version?
09-20-2022 01:30 AM
- 9.1
- 10.5
- 10.4 LTS
11-30-2022 01:38 AM
We're having the same problem with 11.3 LTS. Are there any updates?
We would like to deliver log4j messages from Databricks Notebooks to custom log files and then upload those to S3 or DBFS.
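For illustration, copying such custom log files out to DBFS from a notebook might look roughly like this (both the local log directory and the destination path are placeholders):

```python
# Rough sketch: copy the custom rolled log files from the driver's local log
# directory out to DBFS (an S3 mount would work the same way).
# Both paths are placeholders, not confirmed locations.
import glob
import os

for local_path in glob.glob("/databricks/driver/logs/*bplm*"):
    dbutils.fs.cp(f"file:{local_path}",
                  f"dbfs:/mnt/my-log-bucket/{os.path.basename(local_path)}")
```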
Best

