
PySpark logging - custom logging to an Azure blob mount directory

kjoth
Contributor II

I'm using the logging module to log events from the job, but it seems the log file is created with only one line. Subsequent log events are not being recorded. Is there any reference for custom logging in Databricks?
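A setup along these lines illustrates the situation (a hypothetical sketch; the paths and logger name are illustrative). When the log file sits on a blob mount, which does not support appends, only the first record may survive:

```python
import logging

# Hypothetical setup: a FileHandler pointing at a mounted blob path.
# Appends are not supported on such mounts, which can leave the file
# with only the first record.
logger = logging.getLogger("job_logger")
logger.setLevel(logging.INFO)
handler = logging.FileHandler("/dbfs/mnt/logs/job.log")  # illustrative path
handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(message)s"))
logger.addHandler(handler)

logger.info("first event")   # recorded
logger.info("second event")  # may be lost on an append-unfriendly mount
```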


6 REPLIES

Kaniz
Community Manager

Hi @karthick J, when you create your cluster in Databricks, there is a tab where you can specify the log directory (empty by default).

Logs are written to DBFS, so you just have to specify the directory you want.

[Screenshot: cluster configuration Logging tab, 2021-11-11]
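For reference, the same destination can also be set when creating the cluster through the Clusters API; a minimal sketch of the relevant field, assuming the standard cluster_log_conf structure (the cluster name and destination path are illustrative):

```python
# Hypothetical fragment of a Clusters API (clusters/create) request body,
# expressed as a Python dict. Log files are then delivered to the given
# DBFS destination.
cluster_spec = {
    "cluster_name": "my-job-cluster",  # illustrative
    "cluster_log_conf": {
        "dbfs": {"destination": "dbfs:/cluster-logs/my-job-cluster"}
    },
}
```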

kjoth
Contributor II

Thanks for the answer.

Instead of cluster logs, I want to write only a specific program run's logs to a file, with rotation. Is there a way to write the logs to a mounted blob storage directory in append mode? I have read that file writes on Databricks don't support append mode.
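One common workaround, sketched below under the assumption that the mount rejects appends (paths are illustrative), is to rotate the logs on the driver's local disk and copy the files to the mount afterwards:

```python
import logging
from logging.handlers import RotatingFileHandler

# Rotate on the driver's local disk, where appends work normally.
local_path = "/tmp/job.log"  # illustrative
handler = RotatingFileHandler(local_path, maxBytes=1_000_000, backupCount=3)
handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(message)s"))

logger = logging.getLogger("job_logger")
logger.setLevel(logging.INFO)
logger.addHandler(handler)

logger.info("event 1")
logger.info("event 2")

# At the end of the run, copy the log file to the blob mount.
# dbutils is available in Databricks notebooks; the target path is illustrative.
dbutils.fs.cp("file:/tmp/job.log", "dbfs:/mnt/logs/job.log")
```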

jose_gonzalez
Moderator
Accepted Solution

Hi @karthick J,

If you want to create a custom logger, you will need to use log4j to create it. The first post will show you how to do it. If you want to save your captured events, follow the second post that Kaniz has shared. You will need to parse your data when reading it back.
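A minimal sketch of that approach, assuming a Databricks notebook where `spark` is already defined (the logger name is illustrative, and `spark._jvm` is an internal gateway into the driver JVM):

```python
# Obtain a log4j logger through the JVM gateway exposed by the SparkSession.
log4j = spark._jvm.org.apache.log4j
logger = log4j.LogManager.getLogger("MyJobLogger")  # name is illustrative

logger.info("Job started")
logger.warn("Low disk space on driver")  # log4j uses warn(), not warning()
logger.error("Job failed")
```

Messages logged this way land in the driver's log4j output, which is what the cluster log delivery described above ships to DBFS.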

Thank you for the answer

Anonymous
Not applicable

@karthick J - If Jose's answer helped solve the issue, would you be happy to mark their answer as best so that others can find the solution more easily?
