Data Engineering

I do not have any Spark jobs running, but my cluster is not getting auto-terminated.

User16869510359
Esteemed Contributor

The cluster is idle and there are no Spark jobs running in the Spark UI, but the cluster is still active and not getting auto-terminated.

1 ACCEPTED SOLUTION


User16869510359
Esteemed Contributor

A Databricks cluster is treated as active if any Spark or non-Spark operations are running on it. Even when no Spark jobs are running, driver-specific application code may still be executing, which marks the cluster as non-idle. Driver-specific application code could be something like connecting to a REST API server, polling a database, etc.
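
For example, a notebook cell like the sketch below (hypothetical endpoint and interval, just to illustrate the idea) counts as driver-specific activity: it runs entirely on the driver, never shows up as a job in the Spark UI, and keeps the cluster reported as non-idle for as long as the loop runs.

```python
import time
import requests

# Hypothetical REST endpoint -- stands in for whatever service the driver code polls.
STATUS_URL = "https://example.com/api/status"

while True:
    # Plain Python on the driver: no Spark job is submitted and nothing appears
    # in the Spark UI, but the cluster is still doing work and stays non-idle.
    resp = requests.get(STATUS_URL, timeout=10)
    print("upstream status:", resp.status_code)
    time.sleep(30)  # poll every 30 seconds
```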

Another case where a cluster is marked as non-idle is when it has an active JDBC/ODBC connection that submits queries at frequent intervals.
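
To illustrate that scenario, here is a minimal sketch of an external client using the databricks-sql-connector package (the connection details are placeholders). Because it submits a query every minute over the same connection, the cluster keeps registering activity and never reaches its idle timeout.

```python
import time
from databricks import sql  # pip install databricks-sql-connector

# Placeholder connection details -- substitute your workspace values.
connection = sql.connect(
    server_hostname="<workspace-host>",
    http_path="<cluster-http-path>",
    access_token="<personal-access-token>",
)
cursor = connection.cursor()

# Even a trivial query, repeated on a schedule, keeps the JDBC/ODBC session
# active, so auto-termination never sees the cluster as idle.
while True:
    cursor.execute("SELECT 1")
    cursor.fetchall()
    time.sleep(60)  # one query per minute
```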
