Cluster logs missing

User16869510359
Esteemed Contributor

On the Databricks cluster UI, when I click on the Driver logs, sometimes I see historic logs and sometimes only logs from the last few hours. Why do we see this inconsistency?

1 ACCEPTED SOLUTION

User16869510359
Esteemed Contributor

This is working as designed; it is the expected behavior.

When the cluster is in a terminated state, the logs are served by the Spark History Server hosted on the Databricks control plane.

When the cluster is up and running, the logs are served by the Spark Driver at that point in time.

Because of this architecture, when the cluster is terminated you will see the logs for the last 30 days, and when the cluster is up and running you will see the logs since the last start/restart of the cluster.
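If you need driver logs to persist beyond either window, Databricks supports delivering cluster logs to a storage destination via the cluster's log configuration. A minimal sketch of a cluster spec using the `cluster_log_conf` field (the cluster name, node type, Spark version, and destination path here are placeholder assumptions, not values from this thread):

```json
{
  "cluster_name": "example-cluster",
  "spark_version": "13.3.x-scala2.12",
  "node_type_id": "i3.xlarge",
  "num_workers": 2,
  "cluster_log_conf": {
    "dbfs": { "destination": "dbfs:/cluster-logs" }
  }
}
```

With this set, driver and worker logs are periodically copied to the configured destination, so they remain available independently of whether the cluster is running or terminated.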

