
Loguru doesn't save logs to Databricks volume

Johannes_E
New Contributor II

I've added an external volume named "logs" to my Databricks Unity Catalog. Within a Databricks notebook I can verify that it exists (os.path.exists(path='/Volumes/my_catalog/schema_name/logs')) and can even write a file to it that I can see in the Databricks UI, using the following syntax:

with open('/Volumes/my_catalog/schema_name/logs/test_file.txt', 'w') as file:
    file.write('my test')

 

But when I try to use loguru to write logs to the volume, no log files appear there. I've used the following syntax:

from datetime import datetime
import os

from loguru import logger

LOGS_FOLDER_PATH = '/Volumes/my_catalog/schema_name/logs/'
DATE_TIME_STRING = datetime.now().strftime('%Y-%m-%d__%H_%M_%S')

file_handler_path = LOGS_FOLDER_PATH + DATE_TIME_STRING + '.log'

logger.add(file_handler_path)
logger.info((f'Logging is set up including the file handler that saves the logs to the '
             f'following destination: {file_handler_path}'))
logger.error('an error has happened')
logger.warning('my warning')
logger.info('my info')



The first logger.info statement logs the following path as the destination: /Volumes/my_catalog/schema_name/logs/2025-04-30__10_01_24.log

But within the Databricks UI I can't see a log file! os.listdir(path='/Volumes/my_catalog/schema_name/logs') tells me the file exists, but after restarting my cluster it tells me the file no longer exists (so it probably never actually reached the volume).
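To rule out records simply sitting in the still-open file handle, the sink can be closed explicitly before re-checking the path: logger.add() returns a handler id, and logger.remove(handler_id) closes the sink and flushes anything pending. A minimal diagnostic sketch (same volume path as above):

import os

from loguru import logger

# Close the sink explicitly so any buffered records are flushed to disk
# before checking whether the file made it to the volume.
handler_id = logger.add('/Volumes/my_catalog/schema_name/logs/flush_test.log')
logger.info('flush test')
logger.remove(handler_id)  # closes the file handle and flushes pending writes

print(os.path.exists('/Volumes/my_catalog/schema_name/logs/flush_test.log'))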

When I add a folder to the file handler path (e.g. "/Volumes/my_catalog/schema_name/logs/additional_folder/2025-04-30__10_01_24.log" instead of "/Volumes/my_catalog/schema_name/logs/2025-04-30__10_01_24.log"), the folder "additional_folder" is actually created within the external volume "logs", but the log file still is not. So loguru does do something and clearly has access to the external volume.

Within another Databricks project I've used the same syntax, but the file handler path pointed to a mounted Azure Blob Storage container. In that project, saving the logs worked. So I guess the problem lies somewhere within the external volume that I added in Databricks Unity Catalog...
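If I understand the Databricks docs correctly, Unity Catalog volumes do not support direct appends or non-sequential (random) writes, and appending on every record is exactly what loguru's file sink does; open(..., 'w') followed by a single write (as in the test above) is a sequential write and therefore works. If that is the cause, one workaround is to log to local ephemeral disk and copy the finished file to the volume in one go. A sketch, assuming /tmp on the driver as the local location:

from datetime import datetime
import shutil

from loguru import logger

LOGS_FOLDER_PATH = '/Volumes/my_catalog/schema_name/logs/'
DATE_TIME_STRING = datetime.now().strftime('%Y-%m-%d__%H_%M_%S')

# Log to the driver's local disk, where appending is unproblematic.
local_log_path = f'/tmp/{DATE_TIME_STRING}.log'
handler_id = logger.add(local_log_path)

logger.info('my info')
# ... rest of the workload ...

# Close the sink so the file is complete, then copy it to the volume
# in a single sequential write.
logger.remove(handler_id)
shutil.copy(local_log_path, LOGS_FOLDER_PATH + DATE_TIME_STRING + '.log')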

 

2 REPLIES

Thomas_Zhang
New Contributor III

I am having the same problem. I am currently using a workaround but would definitely love to see a proper solution.

FYI, here is my workaround:

logger.add(
    f"{output_folder_path}/../logging/workflow_job1_{datetime_str}.log",
    rotation='10 days',
    retention="10 days",
    level="DEBUG",
)
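(The rotation, retention, and level arguments are standard loguru options and shouldn't affect whether the file shows up at all; what presumably matters is that {output_folder_path}/../logging resolves to a location where appends work, e.g. a mounted path rather than a Unity Catalog volume.)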

Johannes_E
New Contributor II

Unfortunately, your workaround does not work in my case 😞
