
How to generate log files in a specific folder

aupres
New Contributor III

Hello! My environment is as follows:

 

OS : Windows 11

Spark : spark-4.0.0-preview2-bin-hadoop3

 

And here is the configuration in the Spark files 'spark-defaults.conf' and 'log4j2.properties':

spark-defaults.conf

 

spark.eventLog.enabled             true
spark.eventLog.dir                 file:///C:/spark-4.0.0-preview2-bin-hadoop3/sparkeventlogs
spark.serializer                   org.apache.spark.serializer.KryoSerializer
spark.driver.memory                5g
spark.yarn.am.memory               1g
spark.executor.instances           1

spark.executor.extraJavaOptions        -Dlog4j.configuration=file:///C:/spark-4.0.0-preview2-bin-hadoop3/conf/log4j.properties
spark.driver.extraJavaOptions        -Dlog4j.configuration=file:///C:/spark-4.0.0-preview2-bin-hadoop3/conf/log4j.properties

 

log4j2.properties

 

rootLogger = INFO, console, file
rootLogger.level = info
rootLogger.appenderRef.stdout.ref = console, file
rootLogger.appenderRef.file.ref = FileAppender

appender.file=org.apache.log4j.RollingFileAppender 
appender.file.File=C:/spark-4.0.0-preview2-bin-hadoop3/logs
appender.file.MaxFileSize=10MB 
appender.file.MaxBackupIndex=10 
appender.file.layout=org.apache.log4j.PatternLayout 
appender.file.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

 

But no log files are generated in the C:/spark-4.0.0-preview2-bin-hadoop3/logs folder. Any idea?

 

1 REPLY

Alberto_Umana
Databricks Employee

Hi @aupres,

Do you see any failures in the Spark logs?

It appears that the log files are not being generated in the specified directory due to a misconfiguration in your log4j2.properties file. A few things to validate:

 

Check the Appender Configuration:

Ensure that the file appender is configured to write to an actual log file. The File property should point to the exact file path where you want the logs to be written, not just to the directory.

 

appender.file=org.apache.log4j.RollingFileAppender
appender.file.File=C:/spark-4.0.0-preview2-bin-hadoop3/logs/spark.log
appender.file.MaxFileSize=10MB
appender.file.MaxBackupIndex=10
appender.file.layout=org.apache.log4j.PatternLayout
appender.file.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
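
Note that Spark 4.x ships with Log4j 2, which does not recognize the Log4j 1.x class names above, so in a log4j2.properties file the same rolling-file setup is usually expressed with Log4j 2 property syntax. A minimal sketch, assuming your installation paths and a console appender named "console" as defined in Spark's bundled log4j2.properties.template:

# Log4j 2 syntax for a rolling file appender (file paths assumed from your install)
appender.file.type = RollingFile
appender.file.name = file
appender.file.fileName = C:/spark-4.0.0-preview2-bin-hadoop3/logs/spark.log
appender.file.filePattern = C:/spark-4.0.0-preview2-bin-hadoop3/logs/spark-%i.log
appender.file.layout.type = PatternLayout
appender.file.layout.pattern = %d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
appender.file.policies.type = Policies
appender.file.policies.size.type = SizeBasedTriggeringPolicy
appender.file.policies.size.size = 10MB
appender.file.strategy.type = DefaultRolloverStrategy
appender.file.strategy.max = 10

# Root logger referencing both appenders by their "name" values
rootLogger.level = info
rootLogger.appenderRef.console.ref = console
rootLogger.appenderRef.file.ref = file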

Ensure Correct Log4j2 Configuration: Verify that the log4j2.properties file is being correctly referenced in your Spark configuration. The paths specified in spark.executor.extraJavaOptions and spark.driver.extraJavaOptions should point to the correct log4j2.properties file.

spark.executor.extraJavaOptions=-Dlog4j.configuration=file:///C:/spark-4.0.0-preview2-bin-hadoop3/conf/log4j2.properties
spark.driver.extraJavaOptions=-Dlog4j.configuration=file:///C:/spark-4.0.0-preview2-bin-hadoop3/conf/log4j2.properties
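
Also worth checking: -Dlog4j.configuration is the Log4j 1.x system property and is ignored by Log4j 2. Since Spark 4.x uses Log4j 2, the flag that normally takes effect is -Dlog4j2.configurationFile. A sketch for spark-defaults.conf, assuming the same paths as above:

spark.driver.extraJavaOptions      -Dlog4j2.configurationFile=file:///C:/spark-4.0.0-preview2-bin-hadoop3/conf/log4j2.properties
spark.executor.extraJavaOptions    -Dlog4j2.configurationFile=file:///C:/spark-4.0.0-preview2-bin-hadoop3/conf/log4j2.properties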

Verify Directory Permissions: Ensure that the directory C:/spark-4.0.0-preview2-bin-hadoop3/logs exists and that the user running the Spark application has write permissions to this directory.
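
For example, from a Windows command prompt you could create the folder if it is missing and inspect its access control list (path assumed from your setup):

mkdir C:\spark-4.0.0-preview2-bin-hadoop3\logs
icacls C:\spark-4.0.0-preview2-bin-hadoop3\logs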
