How to write log entries from a Delta Live Table pipeline.

159312
New Contributor III

From a notebook I can get the log4j logger from sc and write to the log like so:

# Access the JVM's log4j classes through the SparkContext's py4j gateway
log4jLogger = sc._jvm.org.apache.log4j

# Get a logger named after the current module and write an INFO entry
LOGGER = log4jLogger.LogManager.getLogger(__name__)
LOGGER.info("pyspark script logger initialized")

But this does not work in a Delta Live Table Pipeline. I would like to be able to publish debugging messages to the log that appears when a pipeline runs. How can I get the logger in a DLT pipeline script?

3 REPLIES

Kaniz
Community Manager

Hi @Ben Bogart, the event log for each pipeline is stored in a Delta table in DBFS. You can view event log entries in the Delta Live Tables user interface, through the Delta Live Tables API, or by directly querying the Delta table.

The documentation article focuses on querying that Delta table directly, and its example notebook includes the queries it discusses, which can be used to explore the Delta Live Tables event log.

Note that those queries use JSON SQL functions available in Databricks Runtime 8.1 and above.
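For reference, a minimal sketch of what querying the event log directly from a notebook might look like. The storage path below is hypothetical; substitute your own pipeline's configured storage location, under which DLT keeps the event log at system/events:

# Hypothetical path; replace with your pipeline's "storage" setting
event_log_path = "/pipelines/my-pipeline/system/events"

# The event log is an ordinary Delta table, so it can be read directly
event_log = spark.read.format("delta").load(event_log_path)

# Register a view so the JSON SQL functions from the article can be applied
event_log.createOrReplaceTempView("event_log_raw")

# Example: list events in order, with their level and message
spark.sql("SELECT timestamp, level, message FROM event_log_raw ORDER BY timestamp").show(truncate=False)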

159312
New Contributor III

Thank you Kaniz. I have seen that article, but it doesn't address how to write log entries, which is what I am trying to do.

To record my own log messages, would I write directly to `event_log_raw`? What is Databricks doing under the hood?

zero234
New Contributor III

Hi @159312 and @Kaniz,
Did you find any solution to this, i.e. how to write custom log messages from a DLT pipeline?