Data Engineering

Driver is up but is not responsive, likely due to GC.

vamsivarun007
New Contributor II

Hi all,

"Driver is up but is not responsive, likely due to GC."

This is the message in the cluster event logs. Can anyone help me with this? What does GC mean? Garbage collection? Can we control it externally?

5 REPLIES

shyam_9
Valued Contributor

Hi @vamsivarun007,

Please go through the KB article below to resolve this issue:

https://kb.databricks.com/jobs/driver-unavailable.html

For an explanation of what GC is, please check this answer:

https://forums.databricks.com/questions/14725/how-to-resolve-spark-full-gc-on-cluster-startup.html

Carlos_AlbertoG
New Contributor II

spark.catalog.clearCache() solved the problem for me 😉

Jaron
New Contributor III

Hi, I ran into the same problem when training a deep learning model. Could you tell me where to set spark.catalog.clearCache()? Thanks!

jacovangelder
Honored Contributor

9/10 times, excessive GC like this is due to out-of-memory problems.

@Jaron spark.catalog.clearCache() is not a configurable option, but rather a command to submit.
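
For what it's worth, here is a minimal sketch of how that command can be run, assuming a Databricks Python notebook (or any PySpark session) where the SparkSession is available as spark:

# Minimal sketch: clear all cached tables/DataFrames to release memory on the cluster.
# In a Databricks notebook `spark` already exists; getOrCreate() simply reuses it.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
spark.catalog.clearCache()  # drops every cached table/DataFrame from memory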

Jaron
New Contributor III

Thanks! But I'm running the Python script via Workflows → Jobs, so I can't submit spark.catalog.clearCache() from a notebook, because the job run is isolated. Is there any way out of this situation? 😭

For another question, may I ask whether the "memory" you mentioned is spark.executor.memory? My program runs with 64 GB of machine memory, which should be large enough, but this GC issue still occurs. The docs all mention that spark.executor.memory may be too small, but I don't know how to check or change it. (so tired 😫)

Looking forward to your reply, thanks!!!
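
A hedged sketch of one way to handle this inside the job's script itself, assuming the job runs a standalone PySpark script (the fallback string below is just illustrative): the SparkSession API is also available in the script, so spark.catalog.clearCache() can be called there directly, and the executor memory the job actually received can be read from the Spark configuration rather than inferred from the machine's total RAM.

# Hypothetical sketch for a Python script run as a Databricks job.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# The memory that matters is the executor memory set by the cluster's Spark config,
# not the machine's 64 GB of RAM; print what the job actually received.
print("spark.executor.memory =", spark.conf.get("spark.executor.memory", "cluster default"))

# ... training / heavy DataFrame work ...

# The same API a notebook would use is available here, so the cache can be
# cleared between memory-heavy stages without a separate notebook.
spark.catalog.clearCache()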
