How to add custom logging in Databricks

User16869510359
Esteemed Contributor

I want to add custom log messages that are redirected to the Spark driver logs. Can I use the existing logger classes to write my application logs or progress messages to the Spark driver logs?

1 ACCEPTED SOLUTION

Accepted Solutions

User16869510359
Esteemed Contributor

Set custom log messages in Scala:

%scala

import org.slf4j.{Logger, LoggerFactory}

val customLogs = LoggerFactory.getLogger("CustomLogs")

customLogs.info("Testing INFO Logs")
customLogs.warn("Testing WARN Logs")
customLogs.error("Testing ERROR Logs")

print("This message will be in stdout instead of log4j")

Set custom log messages in Python:

%python

# Access the JVM log4j API through the py4j gateway
log4jLogger = spark.sparkContext._jvm.org.apache.log4j
customLogs = log4jLogger.LogManager.getLogger("CustomLogs")

customLogs.info("Testing INFO Logs")
customLogs.warn("Testing WARN Logs")
customLogs.error("Testing ERROR Logs")

print("This message will be in stdout instead of log4j")
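If you want the same pattern in plain Python without going through the JVM bridge, a minimal sketch using the standard library's logging module follows; the logger name, format string, and stderr handler below are illustrative choices, not Databricks requirements. On Databricks, records sent to stderr show up in the driver's stderr log rather than in log4j-active.log.

```python
import logging
import sys

# Named logger, mirroring the "CustomLogs" name used with log4j above
custom_logs = logging.getLogger("CustomLogs")
custom_logs.setLevel(logging.INFO)

# Send records to stderr with a timestamped, level-prefixed format
handler = logging.StreamHandler(sys.stderr)
handler.setFormatter(
    logging.Formatter("%(asctime)s %(levelname)s %(name)s: %(message)s")
)
custom_logs.addHandler(handler)

custom_logs.info("Testing INFO Logs")
custom_logs.warning("Testing WARN Logs")
custom_logs.error("Testing ERROR Logs")
```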


3 REPLIES

User16869510359
Esteemed Contributor


Adding more clarity for people using this: the log files will be located at dbfs:/cluster-logs/<cluster_id>/driver/log4j-active.log (when cluster log delivery is configured for the cluster).

@User16869510359 is there a way to get the custom logs saved in their own .log file? I'm guessing we would have to manually connect to Azure Blob Storage and store them there (via the azure-storage-logging Python library).
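Before reaching for a cloud SDK, one lightweight sketch is to point a stdlib logging.FileHandler at a dedicated file; the path and logger name below are hypothetical choices for illustration, and this writes through Python's logging module rather than the log4j bridge, so these records will only go where the handler points.

```python
import logging

# Hypothetical destination: on Databricks, a path under /dbfs/... persists
# beyond the cluster, while /tmp/... is local to the driver
LOG_PATH = "/tmp/custom_logs.log"

logger = logging.getLogger("CustomLogsFile")
logger.setLevel(logging.INFO)

# Dedicated file handler so these messages land in their own .log file
file_handler = logging.FileHandler(LOG_PATH)
file_handler.setFormatter(
    logging.Formatter("%(asctime)s %(levelname)s %(name)s: %(message)s")
)
logger.addHandler(file_handler)

logger.info("Written to a dedicated log file")
```

The resulting file can then be copied or shipped wherever you need it (e.g. uploaded to blob storage on a schedule), which keeps the logging call sites simple.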

Kaizen
Contributor III

1) Is it possible to save all the custom logging to its own file? Currently it is being logged together with all the other cluster logs (see image).

2) Also, it seems like Databricks is creating a lot of blank files for this, including the stderr and stdout files in the driver. Is this a bug? @Debayan

Kaizen_0-1707502477163.png

 
