From a notebook I can get the Log4j logger through the SparkContext (`sc`) and write to a log like so:
```python
log4jLogger = sc._jvm.org.apache.log4j
LOGGER = log4jLogger.LogManager.getLogger(__name__)
LOGGER.info("pyspark script logger initialized")
```
But this does not work in a Delta Live Tables (DLT) pipeline. I would like to publish debugging messages to the log that appears when a pipeline runs. How can I get a logger in a DLT pipeline script?
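For context, here is a minimal sketch of what I would like to work inside the pipeline script. It falls back to Python's standard `logging` module (an assumption on my part, since `sc._jvm` is not accessible); whether DLT actually surfaces these messages in the pipeline run log is exactly what I'm asking. The logger name `dlt_pipeline` is just a placeholder I chose:

```python
import logging

# Assumption: use the standard Python logging module as a fallback,
# since the sc._jvm Log4j bridge is unavailable in a DLT pipeline.
logger = logging.getLogger("dlt_pipeline")  # placeholder name
logger.setLevel(logging.INFO)

# Attach a handler once; guard against duplicates, since the runtime
# may already have configured handlers on re-execution.
if not logger.handlers:
    handler = logging.StreamHandler()
    handler.setFormatter(
        logging.Formatter("%(asctime)s %(levelname)s %(name)s: %(message)s")
    )
    logger.addHandler(handler)

logger.info("dlt pipeline script logger initialized")
```

I don't know whether messages written this way end up in the DLT event log, or whether there is a supported API for this.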