Show log4j messages in run output

Dom1
New Contributor II
Hi,

I have an issue when running JAR jobs. I expect to see logs in the output window of a run. Unfortunately, I can only see messages that are generated with "System.out.println" or "System.err.println". Everything that is logged via slf4j only shows up in the log4j logs of the cluster.
[screenshot of the task's run output attached]
Is there a way to show the slf4j logs in the output window of the executed run? When we log in Python, those logs are shown, so I hope this is also possible for Java code.
 
Thanks for your help

Dominik 
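
A crude stopgap that follows directly from the observation above: since only System.out and System.err reach the run output, important messages can be mirrored to stdout in addition to being logged via slf4j. The LogMirror helper below is a hypothetical sketch, not part of any Databricks or slf4j API.

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

// Hypothetical helper: mirrors messages to stdout so they appear in the
// task's run output, while still sending them through slf4j for the cluster logs.
public final class LogMirror {

	private LogMirror() {}

	public static void info(Logger log, String msg) {
		log.info(msg);           // ends up in the cluster's log4j logs
		System.out.println(msg); // ends up in the run output window
	}
}

class Example {

	private static final Logger LOG = LoggerFactory.getLogger(Example.class);

	public static void main(String[] args) {
		LogMirror.info(LOG, "Visible in both the run output and the log4j logs");
	}
}

This obviously loses log levels and formatting in the run output; it is only a stopgap until the logging configuration itself can be redirected.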
4 REPLIES

Dom1
New Contributor II

Hi @Retired_mod 

thanks for your response. I just built a minimal project that looks like this:

package logging;

import lombok.extern.slf4j.Slf4j;

@Slf4j
public class TestLogging {

	public static void main(String[] args) {
		log.trace("Test: Trace message");
		log.debug("Test: Debug message");
		log.info("Test: Info message");
		log.info("Test: New test with log4j2");
		log.warn("Test: Warning message");
		log.error("Test: Error message");
	}
}

So just a simple main method that logs on different levels. My pom dependencies look like this:

	<dependencies>

		<dependency>
			<groupId>org.apache.logging.log4j</groupId>
			<artifactId>log4j-api</artifactId>
			<version>2.23.1</version>
		</dependency>

		<dependency>
			<groupId>org.apache.logging.log4j</groupId>
			<artifactId>log4j-core</artifactId>
			<version>2.23.1</version>
		</dependency>

		<dependency>
			<groupId>org.apache.logging.log4j</groupId>
			<artifactId>log4j-slf4j2-impl</artifactId>
			<version>2.23.1</version>
		</dependency>

		<dependency>
			<groupId>org.slf4j</groupId>
			<artifactId>slf4j-api</artifactId>
			<version>2.0.12</version>
		</dependency>

		<dependency>
			<groupId>org.projectlombok</groupId>
			<artifactId>lombok</artifactId>
			<version>1.18.30</version>
			<scope>provided</scope>
		</dependency>
	</dependencies>

And here is my log4j2.xml file:

<?xml version="1.0" encoding="UTF-8"?>
<Configuration>
	<Appenders>

		<Console name="console" target="SYSTEM_OUT">
			<PatternLayout pattern="%d{yyyy-MM-dd HH:mm:ss} [%thread] %-5level %logger{36} - %msg%n" />
		</Console>

	</Appenders>
	<Loggers>
		<Root level="INFO">
			<AppenderRef ref="console" />
		</Root>
	</Loggers>
</Configuration>

When I run this on my local machine, it shows the following logs:

2024-04-18 14:33:00 [main] INFO  logging.TestLogging - Test: Info message
2024-04-18 14:33:00 [main] INFO  logging.TestLogging - Test: New test with log4j2
2024-04-18 14:33:00 [main] WARN  logging.TestLogging - Test: Warning message
2024-04-18 14:33:00 [main] ERROR logging.TestLogging - Test: Error message

But when I package this as a JAR, upload it to Databricks, and run the code, it does not print anything in the output window of the task. I see some logs in the Log4j output of the cluster, but with a different logging pattern:

24/04/18 12:24:10 INFO TestLogging: Test: Info message
24/04/18 12:24:10 INFO TestLogging: Test: New test with log4j2
24/04/18 12:24:10 WARN TestLogging: Test: Warning message
24/04/18 12:24:10 ERROR TestLogging: Test: Error message

So I assume the logging settings of my JAR are replaced somewhere. I don't know how to show the log messages in the output of the task run (or redirect them into a file for this specific run).

Hope the examples show what I am trying to accomplish 😉
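
One possible workaround, sketched under the assumption that the Databricks runtime ships Log4j 2 and substitutes its own configuration for the log4j2.xml packaged in the JAR: attach a console appender to the root logger programmatically at startup, so the slf4j messages are written to stdout and should therefore show up in the run output. This is untested here and not a confirmed fix.

import org.apache.logging.log4j.Level;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.core.LoggerContext;
import org.apache.logging.log4j.core.appender.ConsoleAppender;
import org.apache.logging.log4j.core.config.Configuration;
import org.apache.logging.log4j.core.layout.PatternLayout;

public final class StdoutLogging {

	// Call once at the very beginning of main(), before the first log statement.
	public static void attachConsoleAppender() {
		LoggerContext ctx = (LoggerContext) LogManager.getContext(false);
		Configuration config = ctx.getConfiguration();

		PatternLayout layout = PatternLayout.newBuilder()
				.withPattern("%d{yyyy-MM-dd HH:mm:ss} [%thread] %-5level %logger{36} - %msg%n")
				.build();

		ConsoleAppender appender = ConsoleAppender.newBuilder()
				.setName("run-output-console")
				.setTarget(ConsoleAppender.Target.SYSTEM_OUT)
				.setLayout(layout)
				.build();
		appender.start();

		// Register the appender on the root logger of whatever configuration is
		// currently active, bypassing the config the cluster has put in place.
		config.addAppender(appender);
		config.getRootLogger().addAppender(appender, Level.INFO, null);
		ctx.updateLoggers();
	}
}

Calling StdoutLogging.attachConsoleAppender() as the first line of the TestLogging main method above should then duplicate the INFO/WARN/ERROR messages to stdout, if the assumption about the overriding configuration holds.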

Wolvi
New Contributor II

Hi Fatma,

thanks a lot for your reply. Unfortunately, I have the same problem as Dom1. The standard output is visible in the output tab of each task, while the logging messages are not displayed. I tried all of your suggestions, but the behaviour persists. My minimal example, adapted from the one above, looks like this. These are the dependencies in the pom.xml (I also tried slf4j-simple as you suggested):

	<dependencies>

		<!-- SLF4J API -->
		<dependency>
			<groupId>org.slf4j</groupId>
			<artifactId>slf4j-api</artifactId>
			<version>1.7.30</version>
		</dependency>

		<!-- Logback Classic (includes SLF4J binding) -->
		<dependency>
			<groupId>ch.qos.logback</groupId>
			<artifactId>logback-classic</artifactId>
			<version>1.2.3</version>
		</dependency>

	</dependencies>

And this is the code contained in the jar:

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class TestLogging {

	private static final Logger logger = LoggerFactory.getLogger(TestLogging.class);

	public static void main(String[] args) {
		logger.info("This is an INFO message.");
		logger.warn("This is a WARN message.");
		logger.error("This is an ERROR message.");
	}
}

Could this be an issue with Log4j 2? I have version 2.20 installed. Help would be much appreciated, as the missing logs make debugging very cumbersome.

Thanks a lot!
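
A small diagnostic sketch that may help narrow this down: print which SLF4J binding and which slf4j-api version are actually active at runtime. Only standard SLF4J and JDK calls are used, and the output goes to stdout, so it shows up in the task's output tab.

import org.slf4j.LoggerFactory;

public final class Slf4jDiagnostics {

	public static void main(String[] args) {
		// The concrete ILoggerFactory class reveals the active binding, e.g.
		// ch.qos.logback.classic.LoggerContext (Logback) or
		// org.apache.logging.slf4j.Log4jLoggerFactory (log4j-slf4j2-impl).
		System.out.println("SLF4J binding: "
				+ LoggerFactory.getILoggerFactory().getClass().getName());

		// Implementation version of the slf4j-api jar that was actually loaded
		// (may be null if the jar manifest carries no version information).
		Package slf4jApi = LoggerFactory.class.getPackage();
		System.out.println("slf4j-api version: " + slf4jApi.getImplementationVersion());
	}
}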

Wolvi
New Contributor II

Follow-up: It seems that the wrong libraries are loaded in the background. When I upload my JAR and start the task, other versions are displayed than the ones I pack locally. Locally, I pack the following versions into my JAR:

Log4j API version: 2.23.1
Log4j Core version: 2.23.1
SLF4J API version: 2.0.7
SLF4J to Log4j Binding version: 2.20.0
SLF4J Simple version: 2.0.0

However, when I run the task, the following is shown:

NoClassDefFoundError: org/slf4j/simple/SimpleLogger
	at TestLogging.main(TestLogging.java:22)
Caused by: ClassNotFoundException: org.slf4j.simple.SimpleLogger
Log4j API version: 2.20.0
Log4j Core version: 2.20.0
SLF4J API version: 2.0.7
SLF4J to Log4j Binding version: 1.7.25
 
It seems that other libraries are loaded in the background, which could be a problem, as the SLF4J-to-Log4j binding has a version that is potentially not compatible with the other libraries.
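
To check whether cluster-provided jars are indeed shadowing the ones packaged in the uploaded JAR, one can print the code source of the logging classes; the sketch below is a hypothetical diagnostic, not a fix.

import java.security.CodeSource;

public final class ClasspathOrigin {

	public static void main(String[] args) {
		// Shows which jar each logging class was actually loaded from:
		// the uploaded application JAR or a jar shipped with the cluster.
		printOrigin(org.slf4j.LoggerFactory.class);
		printOrigin(org.apache.logging.log4j.LogManager.class);
	}

	private static void printOrigin(Class<?> clazz) {
		CodeSource source = clazz.getProtectionDomain().getCodeSource();
		System.out.println(clazz.getName() + " loaded from "
				+ (source != null ? source.getLocation() : "<unknown / bootstrap class loader>"));
	}
}

If the cluster's versions do win, relocating the logging packages with the Maven Shade plugin is a commonly used mitigation for this kind of classpath conflict, though it has not been verified in this thread.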

dbal
New Contributor III

Any update on this? I am also facing this issue.
