Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

How to write log entries from a Delta Live Table pipeline.

159312
New Contributor III

From a notebook I can grab the log4j logger through the SparkContext (sc) and write to the log like so:

log4jLogger = sc._jvm.org.apache.log4j
LOGGER = log4jLogger.LogManager.getLogger(__name__)
LOGGER.info("pyspark script logger initialized")

But this does not work in a Delta Live Table Pipeline. I would like to be able to publish debugging messages to the log that appears when a pipeline runs. How can I get the logger in a DLT pipeline script?
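A rough sketch of one possible workaround, using Python's standard logging module instead of log4j (the logger name and the table my_table below are only placeholders, and messages written this way typically land in the pipeline cluster's driver log rather than in the DLT event log):

import logging

import dlt
from pyspark.sql import functions as F

# "my_dlt_pipeline" is a hypothetical logger name; use anything unique to your pipeline.
logger = logging.getLogger("my_dlt_pipeline")
logger.setLevel(logging.INFO)

# Attach a handler explicitly so messages are not swallowed by the default configuration.
if not logger.handlers:
    handler = logging.StreamHandler()
    handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(name)s: %(message)s"))
    logger.addHandler(handler)

@dlt.table
def my_table():
    # This message shows up in the driver log, not in the DLT event log shown in the pipeline UI.
    logger.info("building my_table")
    return spark.range(10).withColumn("doubled", F.col("id") * 2)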

2 REPLIES

159312
New Contributor III

Thank you Kaniz. I have seen that article, but it doesn't address how to write log entries, which is what I am trying to do.

To record my own log messages, would I write directly to `event_log_raw`? What is Databricks doing under the hood?
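For context, a minimal sketch of how the event log can be inspected, assuming the pipeline has a storage location configured (the path below is a placeholder) and recreating the `event_log_raw` view used in the docs example; the event log is generally treated as read-only, so custom messages would not be written into it directly:

# Hypothetical storage path; use the "storage" setting from your pipeline's configuration.
pipeline_storage = "dbfs:/pipelines/<pipeline-id>"

# The event log is stored as a Delta table under <storage>/system/events.
event_log = spark.read.format("delta").load(f"{pipeline_storage}/system/events")

# Build the event_log_raw view and inspect the most recent events.
event_log.createOrReplaceTempView("event_log_raw")
display(spark.sql("""
    SELECT timestamp, event_type, message
    FROM event_log_raw
    ORDER BY timestamp DESC
    LIMIT 20
"""))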

zero234
New Contributor III

Hi @159312 and @Retired_mod,
Did you find any solution to this, i.e. a way to write custom log messages in a DLT pipeline?
