Hi all,
"Driver is up but is not responsive, likely due to GC."
This is the message in the cluster event logs. Can anyone help me with this? What does GC mean? Garbage collection? Can we control it externally?
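For reference, GC here is the JVM garbage collector on the driver, so it can't be switched off; the practical lever is giving the driver more headroom or tuning the collector via the cluster's Spark config. A rough sketch of where those knobs live, assuming a Databricks notebook where `spark` is predefined (the 8g / G1GC values are illustrative assumptions, not recommendations):

```python
# Check what the driver currently has, if the setting is exposed on this cluster.
print(spark.sparkContext.getConf().get("spark.driver.memory", "not explicitly set"))

# Cluster UI > Advanced Options > Spark config (example entries only):
# spark.driver.memory 8g
# spark.driver.extraJavaOptions -verbose:gc -XX:+UseG1GC
```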
Hi, I have a problem with writing an Excel file to a mounted location. After about 10 minutes I see "Driver is up but is not responsive, likely due to GC" in the event log. I'm using the following script: df.repartition(1).write .format("com.crealytics.spark....
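For context, a sketch of what that write roughly looks like, assuming the spark-excel (com.crealytics) library is installed on the cluster and `/mnt/output/report.xlsx` is a hypothetical mounted path:

```python
(
    df.repartition(1)                            # one partition so a single .xlsx file is produced
      .write
      .format("com.crealytics.spark.excel")
      .option("header", "true")                  # write column names as the first row
      .mode("overwrite")
      .save("/mnt/output/report.xlsx")
)
```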
It is not a solution to that exact problem, but I recommend handling Excel reads and writes with Koalas (https://koalas.readthedocs.io/en/latest/reference/api/databricks.koalas.DataFrame.to_excel.html). Just give it a try; maybe it will solve your issue.
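Roughly what that would look like, assuming koalas (plus openpyxl for the Excel writer) is available on the cluster; the `/dbfs/mnt/output/report.xlsx` path is a hypothetical mounted location:

```python
import databricks.koalas as ks  # importing koalas also adds to_koalas() to Spark DataFrames

kdf = df.to_koalas()                              # convert the Spark DataFrame to a Koalas DataFrame
kdf.to_excel("/dbfs/mnt/output/report.xlsx",      # pandas-style writer; data is collected to the driver
             sheet_name="Sheet1",
             index=False)
```

Note that to_excel pulls the data onto the driver, so it fits small-to-medium outputs rather than huge ones.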
Solution: I didn't need to add any executor or driver memory. All I had to do in my case was add .option("maxRowsInMemory", 1000). Before, I couldn't even read a 9 MB file; now I can read a 50 MB file without any error. { val df = spark.read .f...
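A PySpark sketch of the same fix (the reply above is Scala), assuming spark-excel is installed on the cluster; the "header" option and the file path are assumptions for illustration:

```python
df = (
    spark.read
         .format("com.crealytics.spark.excel")
         .option("header", "true")
         .option("maxRowsInMemory", 1000)   # stream rows instead of loading the whole sheet into driver memory
         .load("/mnt/raw/large_workbook.xlsx")
)
```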
Can you try without .set("spark.driver.memory","4g") and .set("spark.executor.memory", "6g")? It clearly shows there is not 4 GB free on the driver and 6 GB free on the executor (you can share the cluster hardware details as well). You also cannot allocate 100% for ...