Databricks driver logs

sagiatul
New Contributor II

I am running jobs on Databricks clusters. While a cluster is running, I can find the executor logs by going to the Spark Cluster UI, opening the Master dropdown, selecting a worker, and reading its stderr logs. However, once the job finishes and the cluster terminates, I can no longer see those logs; instead I get the screen shown in the attached screenshot.

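One common way to keep these logs around after termination (not confirmed in this thread, just a suggestion) is to configure cluster log delivery when the cluster is created, so Databricks periodically copies the driver and executor logs to DBFS. Below is a minimal sketch using the Clusters REST API; the workspace URL, token, Spark version, node type, and destination path are all placeholders you would replace with values valid in your workspace:

```python
import requests

# Placeholder values: replace with your workspace URL and a personal access token.
WORKSPACE_URL = "https://<your-workspace>.cloud.databricks.com"
TOKEN = "<personal-access-token>"

cluster_spec = {
    "cluster_name": "job-cluster-with-log-delivery",
    # spark_version and node_type_id are just examples; use values available in your workspace.
    "spark_version": "13.3.x-scala2.12",
    "node_type_id": "i3.xlarge",
    "num_workers": 2,
    # Deliver driver and executor logs to DBFS so they survive cluster termination.
    "cluster_log_conf": {
        "dbfs": {"destination": "dbfs:/cluster-logs"}
    },
}

resp = requests.post(
    f"{WORKSPACE_URL}/api/2.0/clusters/create",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=cluster_spec,
)
resp.raise_for_status()
print(resp.json())  # response includes the new cluster_id
```

With this configuration, the delivered files end up under the chosen destination in a per-cluster folder (e.g. dbfs:/cluster-logs/<cluster-id>/driver and .../executor), so they remain readable after the cluster is gone.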


Debayan
Esteemed Contributor III

Hi, when you say the job has finished and the cluster has terminated, can you still see the cluster listed on the cluster UI page? Also, could you please uncheck the auto-fetch logs option?

Unless the cluster has been deleted, the logs should still be there.
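If cluster log delivery was configured (as in the sketch above), the delivered files can also be listed from any notebook after the cluster is gone. A minimal sketch, assuming the destination dbfs:/cluster-logs and a known cluster ID (both placeholders):

```python
# Runs in a Databricks notebook, where `dbutils` is available by default.
# The destination and cluster ID are placeholders; use the path set in
# cluster_log_conf and the ID of the terminated cluster.
log_root = "dbfs:/cluster-logs/<cluster-id>"

# Executor logs are delivered under the executor subfolder.
for entry in dbutils.fs.ls(f"{log_root}/executor"):
    print(entry.path, entry.size)

# The driver's stderr can be read directly, for example:
print(dbutils.fs.head(f"{log_root}/driver/stderr", 10000))
```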

Please tag @Debayan in your next response, which will notify me. Thank you!

Anonymous
Not applicable

Hi @Atul Arora,

Thank you for your question! To assist you better, please take a moment to review the answer and let us know whether it fits your needs.

Please help us select the best solution by clicking on "Select As Best" if it does.

Your feedback will help us ensure that we are providing the best possible service to you.

Thank you!
