Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

PySpark logging - custom to Azure blob mount directory

kjoth
Contributor II

I'm using the Python logging module to log events from a job, but the log file is created with only one line; the subsequent log events are not being recorded. Is there any reference for custom logging in Databricks?
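A minimal sketch of the kind of setup in question, assuming the standard Python logging module with a FileHandler; the path and logger name are placeholders, and the handler guard matters because re-running a notebook cell otherwise attaches duplicate handlers:

```python
import logging

LOG_PATH = "/tmp/job_run.log"  # placeholder: local driver disk

logger = logging.getLogger("job_logger")
logger.setLevel(logging.INFO)

# Avoid attaching a new FileHandler every time the cell/job is re-run.
if not logger.handlers:
    handler = logging.FileHandler(LOG_PATH, mode="a")
    handler.setFormatter(
        logging.Formatter("%(asctime)s %(levelname)s %(name)s - %(message)s")
    )
    logger.addHandler(handler)

logger.info("Job started")
logger.info("Next event is appended to the same file")
```

If the FileHandler points directly at a /dbfs or blob mount path rather than local disk, appends may not behave the same way, which would be consistent with only the first line surviving.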


4 REPLIES

Thanks for the answer.

Instead of the cluster logs, I want to log only a specific program run's logs to a file, with rotation. Is there a way to write the logs to a mounted blob storage directory in append mode? I have read that in Databricks, file writes don't support append mode.
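One possible approach, sketched under the assumption that appends to the mount are indeed not supported: rotate the log on local driver disk with `RotatingFileHandler`, then copy the finished files to the mounted directory as whole files (the mount path below is hypothetical):

```python
import glob
import logging
import os
import shutil
from logging.handlers import RotatingFileHandler

LOCAL_LOG_DIR = "/tmp/job_logs"          # local driver disk, supports appends
MOUNT_LOG_DIR = "/dbfs/mnt/logs/my_job"  # hypothetical blob mount path

os.makedirs(LOCAL_LOG_DIR, exist_ok=True)

logger = logging.getLogger("job_logger")
logger.setLevel(logging.INFO)

if not logger.handlers:
    handler = RotatingFileHandler(
        os.path.join(LOCAL_LOG_DIR, "run.log"),
        maxBytes=5 * 1024 * 1024,  # rotate at ~5 MB
        backupCount=3,
    )
    handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s - %(message)s"))
    logger.addHandler(handler)

logger.info("Program-specific event")

# At the end of the run, copy the rotated files to the mount as whole files,
# so no append support is needed on the blob storage side.
os.makedirs(MOUNT_LOG_DIR, exist_ok=True)
for path in glob.glob(os.path.join(LOCAL_LOG_DIR, "run.log*")):
    shutil.copy(path, MOUNT_LOG_DIR)
```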

jose_gonzalez
Databricks Employee

Hi @karthick J,

If you want to create a custom logger, you will need to use log4j. The first post shows you how to do that. If you want to save the captured events, follow the second post that Kaniz shared. You will need to parse the data when reading it back.
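The posts referenced above are not reproduced here, but a commonly used pattern for getting a log4j logger from PySpark is to go through the JVM gateway. A minimal sketch, assuming a Databricks notebook or job (where `spark` is already defined); `_jvm` is an internal interface and the logger name is just an example:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # already defined in Databricks notebooks

# Reach the driver's log4j through the Py4J gateway exposed by the SparkSession.
log4j = spark._jvm.org.apache.log4j
custom_logger = log4j.LogManager.getLogger("my_custom_logger")  # example name

custom_logger.info("Custom event recorded via log4j")
custom_logger.warn("This also shows up in the driver's log4j output")
```

Messages written this way land in the driver's log4j output, which is what cluster log delivery captures, so they can later be read back and parsed from the delivered logs.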

Thank you for the answer

Anonymous
Not applicable

@karthick J - If Jose's answer helped solve the issue, would you be happy to mark their answer as best so that others can find the solution more easily?
