Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

How to write log entries from a Delta Live Table pipeline.

159312
New Contributor III

From a notebook I can get the log4j logger from `sc` and write to a log like so:

```python
log4jLogger = sc._jvm.org.apache.log4j
LOGGER = log4jLogger.LogManager.getLogger(__name__)
LOGGER.info("pyspark script logger initialized")
```

But this does not work in a Delta Live Table Pipeline. I would like to be able to publish debugging messages to the log that appears when a pipeline runs. How can I get the logger in a DLT pipeline script?
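Since the JVM gateway (`sc._jvm`) is what fails here, one workaround is to fall back on Python's standard `logging` module, whose output surfaces in the pipeline's driver logs rather than in the DLT event log. This is a minimal sketch, not an official Databricks-documented mechanism, and the logger name `my_dlt_pipeline` is just a placeholder:

```python
import logging
import sys

# Configure a named logger once at the top of the pipeline notebook.
# "my_dlt_pipeline" is an arbitrary placeholder name.
logger = logging.getLogger("my_dlt_pipeline")
logger.setLevel(logging.INFO)

# Guard against adding duplicate handlers when the cell re-executes.
if not logger.handlers:
    handler = logging.StreamHandler(sys.stderr)
    handler.setFormatter(
        logging.Formatter("%(asctime)s [%(levelname)s] %(name)s: %(message)s")
    )
    logger.addHandler(handler)

logger.info("DLT pipeline logger initialized")
```

Messages written this way end up in the driver log output for the pipeline update, so they are useful for debugging but are not queryable alongside the pipeline's own event log.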

2 REPLIES

159312
New Contributor III

Thank you Kaniz. I have seen that article, but it doesn't address how to write log entries, which is what I am trying to do.

To record my own log messages would I write directly to `event_log_raw`? What is Databricks doing under the hood?
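For context, the DLT event log is populated by the pipeline runtime and is read, not written, from user code, so inserting custom rows into it is not the supported path. Each event carries a `details` column holding a JSON payload. A minimal plain-Python sketch of pulling a metric out of such a payload (the field names `flow_progress`, `metrics`, and `num_output_rows` follow the general shape of flow-progress events, but treat the exact structure here as an assumption):

```python
import json

# A hypothetical event-log row's `details` value (a JSON string),
# shaped like a DLT flow_progress event. Field names are assumptions.
sample_details = json.dumps(
    {"flow_progress": {"status": "COMPLETED", "metrics": {"num_output_rows": 42}}}
)

def extract_output_rows(details_json: str):
    """Return num_output_rows from a flow_progress event, or None if absent."""
    details = json.loads(details_json)
    progress = details.get("flow_progress", {})
    return progress.get("metrics", {}).get("num_output_rows")

print(extract_output_rows(sample_details))  # 42
```

In practice you would run this kind of extraction over the event log queried through Spark SQL, with custom debug messages still going to the driver logs rather than into the event log itself.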

zero234
New Contributor III

Hi @159312 and @Retired_mod,
Did you find any solution for writing custom log messages in a DLT pipeline?
