06-23-2021 11:37 PM
I want to add custom log messages that are redirected into the Spark driver logs. Can I use the existing logger classes to get my application's log and progress messages into the Spark driver logs?
Accepted Solutions
06-24-2021 01:35 AM
Set custom log messages in Scala:

```scala
%scala
import org.slf4j.{Logger, LoggerFactory}

val customLogs = LoggerFactory.getLogger("CustomLogs")
customLogs.info("Testing INFO Logs")
customLogs.warn("Testing WARN Logs")
customLogs.error("Testing ERROR Logs")
print("This message will be in stdout instead of log4j")
```

Set custom log messages in Python:

```python
%python
# Access the JVM's log4j through the Spark context's Py4J gateway.
log4jLogger = spark.sparkContext._jvm.org.apache.log4j
customLogs = log4jLogger.LogManager.getLogger("CustomLogs")
customLogs.info("Testing INFO Logs")
customLogs.warn("Testing WARN Logs")
customLogs.error("Testing ERROR Logs")
print("This message will be in stdout instead of log4j")
```
02-06-2024 01:38 PM
Adding more clarity for people using this: the log files will be located at dbfs:/cluster-logs/<cluster_id>/driver/log4j-active.log
@brickster_2018, is there a way to get the custom logs saved in their own .log file? I'm guessing we would have to manually connect to Azure Blob Storage and store them there (via the azure-storage-logging Python library).
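One lightweight alternative to an external storage library, sketched below with Python's standard `logging` module: attach a dedicated `FileHandler` to the custom logger so its messages go to their own file. The file path here is hypothetical; on Databricks you would point it at a DBFS-backed local path (e.g. under `/dbfs/`) or copy the file to cloud storage afterwards.

```python
import logging

# Minimal sketch: route "CustomLogs" messages into a dedicated file.
# "/tmp/custom_logs.log" is a placeholder path for illustration.
handler = logging.FileHandler("/tmp/custom_logs.log")
handler.setFormatter(
    logging.Formatter("%(asctime)s %(levelname)s %(name)s: %(message)s")
)

custom_logs = logging.getLogger("CustomLogs")
custom_logs.setLevel(logging.INFO)
custom_logs.addHandler(handler)

custom_logs.info("Written to its own log file")

# Show what was written.
with open("/tmp/custom_logs.log") as f:
    print(f.read())
```

This keeps the custom messages separate from the cluster-wide log4j output, at the cost of managing the file yourself (rotation, upload, cleanup).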
02-09-2024 10:18 AM - edited 02-09-2024 10:24 AM
1) Is it possible to save all the custom logging to its own file? Currently it is being logged together with all the other cluster logs (see image).
2) Also, it seems like a lot of blank files are being created for this, including the stderr and stdout files in the driver. Is this a bug? @Debayan