Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.


SS2
Valued Contributor

Spark out of memory error.

You can resolve this error by increasing the size of cluster in Databricks.

4 REPLIES

karthik_p
Esteemed Contributor

@S S increasing the cluster size every time may not be a good solution; the right fix depends on the scenario:

  1. Sometimes we may need to tweak the code.
  2. Sometimes we may need to add memory parameters.
  3. Ganglia metrics can give us more information about where the memory is going.
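
For point 2, a minimal sketch of the kind of memory settings that can be tuned before resizing the cluster (the values here are purely illustrative, not recommendations):

```
spark.executor.memory          8g     # heap per executor
spark.executor.memoryOverhead  2g     # off-heap overhead (JVM metadata, shuffle buffers)
spark.driver.memory            8g     # driver heap, matters for large collects/broadcasts
spark.memory.fraction          0.6    # share of heap for execution + storage (0.6 is the default)
```

Raising `spark.executor.memoryOverhead` is often the fix when containers are killed by the resource manager even though the heap itself is not exhausted.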

NhatHoang
Valued Contributor II

Hi guys,

I agree, it is better to improve your code rather than increase the size of the cluster. You can configure the number of partitions.
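
As a rough illustration of sizing partitions (the ~128 MB-per-partition target is a common rule of thumb, not an official figure), one might estimate a partition count like this:

```python
def estimate_partitions(total_bytes, target_bytes=128 * 1024**2, min_parts=1):
    """Estimate a partition count from input size.

    Aims for roughly `target_bytes` per partition so each task's
    working set fits comfortably in executor memory.
    """
    # ceiling division without importing math
    return max(min_parts, -(-total_bytes // target_bytes))

# e.g. a 10 GB shuffle at ~128 MB per partition
parts = estimate_partitions(10 * 1024**3)
print(parts)  # 80
```

The resulting number could then be applied with `df.repartition(parts)` or via the `spark.sql.shuffle.partitions` setting; the helper function itself is just an assumption-laden sketch, not part of any Spark API.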

Shalabh007
Honored Contributor

Jumping directly to increasing the cluster size as the solution is not advisable. I found this nicely written blog on the potential causes of an OOM error in Spark and some initial steps to resolve it:

https://medium.com/swlh/spark-oom-error-closeup-462c7a01709d

DK03
Contributor

Adding some more points to @karthik p's answer:

  1. Use the Kryo serializer instead of the default Java serializer.
  2. Use an optimised garbage collector such as G1GC.
  3. Choose your partitioning field wisely.
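
For points 1 and 2, these are the Spark properties involved (a sketch with illustrative values; the GC flags in particular should be validated against your JVM version):

```
spark.serializer                 org.apache.spark.serializer.KryoSerializer
spark.kryoserializer.buffer.max  256m
spark.executor.extraJavaOptions  -XX:+UseG1GC
spark.driver.extraJavaOptions    -XX:+UseG1GC
```

Kryo is faster and more compact than Java serialization for shuffled and cached data, which directly reduces memory pressure.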
