Here are some suggestions for your consideration.
The issue with your custom logging setup stems from attempting to save the log files under /Workspace/Users/ramya.v@point32health.org/CD/, which is not directly writable by Python's standard file APIs in Databricks. Databricks workspaces use DBFS (Databricks File System), and regular filesystem paths such as /Workspace behave differently in this environment.
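For context, here is a minimal sketch of the failure mode, assuming your setup uses the standard logging.basicConfig pattern (the path is the one from your question; the exact error can vary by runtime):

```python
import logging

# FileHandler opens the target file as soon as basicConfig runs, so on a
# path the driver cannot write to, this typically raises an OSError before
# any log record is ever written.
logging.basicConfig(
    filename="/Workspace/Users/ramya.v@point32health.org/CD/notebook.log",
    level=logging.INFO,
)
```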
To resolve this and ensure that custom log files are created and stored dynamically with each notebook run, you should:
- **Use DBFS for the Log Path:** Update the `log_dir` variable to use a directory within DBFS. For instance:

  ```python
  log_dir = "/dbfs/Workspace/Users/your.email@databricks.com/logs/"
  ```

  Replace `your.email@databricks.com` with your actual workspace email.
- **Ensure the Folder Exists:** Before attempting to create log files, make sure the directory exists on DBFS. You can create it programmatically:

  ```python
  import os

  log_dir = "/dbfs/Workspace/Users/your.email@databricks.com/logs/"
  os.makedirs(log_dir, exist_ok=True)  # no-op if the directory already exists
  ```
- **Update the Logging Configuration:** Modify your logging configuration to write to the DBFS path:

  ```python
  import logging
  import os
  from datetime import datetime

  # Define the log directory and create it if necessary
  log_dir = "/dbfs/Workspace/Users/your.email@databricks.com/logs/"
  os.makedirs(log_dir, exist_ok=True)

  # Create a timestamped file name so each run gets its own log
  timestamp = datetime.now().strftime('%Y%m%d_%H%M%S')
  log_filename = os.path.join(log_dir, f'notebook_log_{timestamp}.log')

  # Configure the logging
  logging.basicConfig(
      filename=log_filename,
      level=logging.INFO,
      format='%(asctime)s - %(levelname)s - %(message)s'
  )

  # Create a logger object and write a first record
  logger = logging.getLogger()
  logger.info("Logging has been initialized successfully.")
  ```

  (If you re-run this cell in the same session, see the note on `force=True` after this list.)
- **Run and Verify:** After making these changes, run the notebook. Files written through the /dbfs mount land in DBFS rather than in the workspace file browser, so verify the logs under dbfs:/Workspace/Users/your.email@databricks.com/logs/ (for example with the dbutils listing in the next step, or the read-back sketch after this list).
- **DBFS Path Accessibility:** After running the notebook, you can access and download the logs through the Databricks UI or use dbutils to manage the files. For example:

  ```python
  files = dbutils.fs.ls("dbfs:/Workspace/Users/your.email@databricks.com/logs/")
  display(files)
  ```
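One caveat worth flagging, since you want a fresh log file per run: logging.basicConfig() is a no-op once the root logger already has handlers, which it will after the first run in a long-lived notebook session. On Python 3.8+ you can pass force=True so each execution of the setup cell replaces the old handler and opens a new timestamped file; treat this as a sketch to adapt, not a required change:

```python
import logging
import os
from datetime import datetime

log_dir = "/dbfs/Workspace/Users/your.email@databricks.com/logs/"
timestamp = datetime.now().strftime('%Y%m%d_%H%M%S')
log_filename = os.path.join(log_dir, f'notebook_log_{timestamp}.log')

logging.basicConfig(
    filename=log_filename,
    level=logging.INFO,
    format='%(asctime)s - %(levelname)s - %(message)s',
    force=True,  # Python 3.8+: close and replace any existing root handlers
)
```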
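To verify from the notebook itself, here is a small read-back sketch, assuming the /dbfs FUSE mount is available on your cluster (this helper is illustrative, not part of any Databricks API):

```python
import os

log_dir = "/dbfs/Workspace/Users/your.email@databricks.com/logs/"

# Pick the most recently modified .log file and print its contents.
# Raises ValueError if no .log files exist yet.
latest = max(
    (os.path.join(log_dir, name) for name in os.listdir(log_dir) if name.endswith(".log")),
    key=os.path.getmtime,
)
with open(latest) as fh:
    print(fh.read())
```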
By following these steps, a custom log file will be generated dynamically with each run and stored in a location you can inspect afterwards. This approach is compatible with Databricks' storage environment.
Cheers, Lou.