How to write log entries from a Delta Live Tables pipeline.
07-07-2022 11:22 AM
From a notebook I can get the log4j logger from the SparkContext (`sc`) and write to a log like so:

```python
# Access the JVM's log4j module through the SparkContext's py4j bridge
log4jLogger = sc._jvm.org.apache.log4j
LOGGER = log4jLogger.LogManager.getLogger(__name__)
LOGGER.info("pyspark script logger initialized")
```
But this does not work in a Delta Live Table Pipeline. I would like to be able to publish debugging messages to the log that appears when a pipeline runs. How can I get the logger in a DLT pipeline script?
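As a stopgap while the `sc._jvm` bridge is unavailable, one workaround (an assumption on my part, not a documented DLT API) is to fall back to Python's standard `logging` module; its output goes to the driver logs rather than the pipeline's event log, but it does let you emit timestamped debug messages. The helper name `get_dlt_logger` below is hypothetical:

```python
import logging
import sys

def get_dlt_logger(name: str) -> logging.Logger:
    """Return a plain Python logger that writes to stderr.

    Hypothetical helper, not a DLT API: messages land in the
    driver logs, not in the DLT event log shown in the pipeline UI.
    """
    logger = logging.getLogger(name)
    if not logger.handlers:  # avoid duplicate handlers on re-run
        handler = logging.StreamHandler(sys.stderr)
        handler.setFormatter(
            logging.Formatter("%(asctime)s %(levelname)s %(name)s: %(message)s")
        )
        logger.addHandler(handler)
        logger.setLevel(logging.INFO)
    return logger

logger = get_dlt_logger("my_pipeline")  # "my_pipeline" is a placeholder name
logger.info("pipeline logger initialized")
```

Since these messages only reach the driver logs, they are useful for interactive debugging but do not answer the original question of publishing entries to the event log the pipeline UI displays.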
07-11-2022 06:43 AM
Thank you Kaniz. I have seen that article, but it doesn't address how to write log entries which is what I am trying to do.
To record my own log messages would I write directly to `event_log_raw`? What is Databricks doing under the hood?
03-07-2024 09:13 AM
Hi @159312 and @Retired_mod,
Did you ever find a solution for writing custom log messages from a DLT pipeline?

