Data Engineering

Custom Log4j logs are not being written to the DBFS storage.

vs_29
New Contributor II

I used a custom Log4j appender, configured through an init script, to write custom logs. I can see the custom log file in the driver logs, but Databricks is not writing those custom logs to DBFS. I have configured the Logging destination in the Advanced options section of the cluster properties. Attaching screenshots of the init script, the Databricks driver logs, and the logs destination.
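For reference, an init script that registers such an appender might look roughly like the sketch below. This is NOT the poster's actual script (which is only visible in the attached screenshots); the config path assumes a Log4j 1.x-era Databricks runtime, and the appender name, log path, and logger package are placeholders for illustration. Newer runtimes (DBR 11+) use a `log4j2.xml` file instead.

```shell
#!/bin/bash
# Hypothetical init-script sketch -- not the poster's actual script.
# Path assumes a Log4j 1.x Databricks runtime; appender and logger
# names below are placeholders.
LOG4J_CONF="/databricks/spark/dbconf/log4j/driver/log4j.properties"

# Create a local directory for the custom log file
mkdir -p /tmp/custom-logs

# Append a custom rolling-file appender to the driver's Log4j config
cat >> "$LOG4J_CONF" <<'EOF'
log4j.appender.customFile=org.apache.log4j.RollingFileAppender
log4j.appender.customFile.File=/tmp/custom-logs/app.log
log4j.appender.customFile.MaxFileSize=20MB
log4j.appender.customFile.layout=org.apache.log4j.PatternLayout
log4j.appender.customFile.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c: %m%n
log4j.logger.com.example=INFO, customFile
EOF
```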

2 REPLIES

Debayan
Esteemed Contributor III

Hi @VIjeet Sharma, do you receive any error?

This can be an issue with using the DBFS mount point /dbfs in an init script: the DBFS mount point is mounted asynchronously, so at the very beginning of init script execution it might not be available yet, and hence the logs are not getting written. Please let us know if I understood the issue correctly.
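If that is the cause, one common workaround is to poll for the mount point before touching it. A minimal sketch, assuming a bash init script (the function name, polling interval, and timeout are illustrative, not Databricks defaults):

```shell
#!/bin/bash
# Hypothetical workaround sketch: block until a mount point (e.g. the
# asynchronously mounted /dbfs) appears before writing logs to it.
wait_for_mount() {
  local mount_point="$1"
  local timeout="${2:-60}"   # max seconds to wait (assumed default)
  local elapsed=0
  until [ -d "$mount_point" ]; do
    if [ "$elapsed" -ge "$timeout" ]; then
      echo "mount point $mount_point not available after ${timeout}s" >&2
      return 1
    fi
    sleep 2
    elapsed=$((elapsed + 2))
  done
  return 0
}

# In the init script, call this before configuring any DBFS log path:
#   wait_for_mount /dbfs 120 || exit 1
```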

Kaniz
Community Manager

Hi @VIjeet Sharma, we haven't heard from you since the last response from @Debayan Mukherjee, and I was checking back to see if his suggestions helped you.

Otherwise, if you have found a solution, please share it with the community, as it can be helpful to others.

Also, please don't forget to click the "Select As Best" button whenever the information provided helps resolve your question.
