Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

by vamsivarun007, New Contributor II
  • 34632 Views
  • 5 replies
  • 2 kudos

Driver is up but is not responsive, likely due to GC.

Hi all, "Driver is up but is not responsive, likely due to GC." This is the message in the cluster event logs. Can anyone help me with this? What does GC mean? Garbage collection? Can we control it externally?

Latest Reply
jacovangelder
Honored Contributor
  • 2 kudos

9/10 times GC is due to out-of-memory exceptions. @Jaron spark.catalog.clearCache() is not a configurable option, but rather a command to submit.
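For anyone hitting the same message: spark.catalog.clearCache() is submitted as code in a notebook cell or job, not set on the cluster. A minimal Scala sketch, assuming the predefined spark session that Databricks notebooks provide:

// Run as a command, not a Spark config. Drops every cached table/DataFrame,
// which can relieve memory pressure before long GC pauses build up.
spark.catalog.clearCache()

// Or release just one DataFrame you no longer need:
// cachedDf.unpersist()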

4 More Replies
by sh_abrishami_ie, New Contributor II
  • 4507 Views
  • 1 reply
  • 3 kudos

Resolved! Driver is up but is not responsive, likely due to GC.

Hi, I have a problem with writing an Excel file to the mounted storage. After 10 mins I see "Driver is up but is not responsive, likely due to GC" in the event log. I'm using the following script: df.repartition(1).write .format("com.crealytics.spark....
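For context, a spark-excel write along the lines of the truncated snippet typically looks like the sketch below; the sample DataFrame, sheet address, and /mnt path are illustrative assumptions, not taken from the post:

import spark.implicits._

// Stand-in DataFrame; replace with the real one from the post.
val df = Seq((1, "a"), (2, "b")).toDF("id", "value")

// repartition(1) forces all rows through a single task to produce one output
// file, which is often what pushes the cluster into heavy GC on larger data.
df.repartition(1)
  .write
  .format("com.crealytics.spark.excel")
  .option("header", "true")
  .option("dataAddress", "'Sheet1'!A1")   // assumed sheet/cell address
  .mode("overwrite")
  .save("/mnt/output/report.xlsx")        // assumed mount path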

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 3 kudos

It is not a solution to that problem, but I recommend handling Excel reads and writes with Koalas (https://koalas.readthedocs.io/en/latest/reference/api/databricks.koalas.DataFrame.to_excel.html). Just give it a try; maybe it will solve your issue.

by sarvesh, Contributor III
  • 30565 Views
  • 18 replies
  • 6 kudos

Resolved! java.lang.OutOfMemoryError: GC overhead limit exceeded. [ solved ]

Solution: I didn't need to add any executor or driver memory; all I had to do in my case was add this: .option("maxRowsInMemory", 1000). Before, I couldn't even read a 9 MB file; now I can read a 50 MB file without any error. { val df = spark.read .f...
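For reference, the streaming-reader option from the solution slots into a spark-excel read roughly like this; the header option and file path are assumptions:

// maxRowsInMemory switches spark-excel to its streaming reader, so only a
// window of rows is held in memory instead of the whole workbook; that is
// the change that avoided the GC overhead limit error here.
val df = spark.read
  .format("com.crealytics.spark.excel")
  .option("header", "true")              // assumed: first row holds column names
  .option("maxRowsInMemory", 1000)       // value taken from the post above
  .load("/mnt/data/input.xlsx")          // assumed path

df.show(5)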

[Attached screenshots: Spark UI]
Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 6 kudos

Can you try without .set("spark.driver.memory","4g") and .set("spark.executor.memory", "6g")? It clearly shows that there is not 4 GB free on the driver and 6 GB free on the executor (you can also share the hardware cluster details). You also cannot allocate 100% for ...

17 More Replies