What is the best practice for logging in Databricks notebooks?
I have a bunch of notebooks that run in parallel through a workflow. I would like to keep track of everything that happens in them, such as errors coming from a stream, and have these logs persisted somewhere, either in DBFS or in a storage account.
I got Python's built-in logging module working, but the log file has to be written to a temp folder on the local file: filesystem and then manually copied over to dbfs:/FileStore/log_folder/text.log. DBFS throws an error if I point the FileHandler directly at a dbfs: path.
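Roughly what I'm doing now (logger name and the local temp path are just placeholders I picked):

```python
import logging

LOCAL_LOG = "/tmp/notebook_run.log"                      # local driver filesystem
DBFS_LOG = "dbfs:/FileStore/log_folder/text.log"         # final destination

# FileHandler writes to the local path; pointing it at dbfs:/ directly fails
logger = logging.getLogger("my_notebook")
logger.setLevel(logging.INFO)
handler = logging.FileHandler(LOCAL_LOG)
handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(name)s: %(message)s"))
logger.addHandler(handler)

try:
    logger.info("stream started")
    # ... notebook / streaming logic ...
except Exception:
    logger.exception("stream failed")
    raise
finally:
    handler.flush()
    # dbutils is available inside Databricks notebooks; copy the local log to DBFS
    dbutils.fs.cp(f"file:{LOCAL_LOG}", DBFS_LOG)
```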
This basically works for my purposes, but what is the actual best practice for doing this in Databricks?