SS2
Valued Contributor

Spark out of memory error.

You can resolve this error by increasing the size of the cluster in Databricks.
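If you want to resize the cluster programmatically rather than through the UI, a minimal sketch using the Databricks Clusters REST API (the workspace URL, token, cluster ID, and worker count are placeholders, not values from this thread):

    import requests

    # Resize an existing cluster via the Databricks Clusters API.
    # All bracketed values are placeholders; the endpoint and payload
    # follow the public REST API documentation.
    resp = requests.post(
        "https://<workspace-url>/api/2.0/clusters/resize",
        headers={"Authorization": "Bearer <personal-access-token>"},
        json={"cluster_id": "<cluster-id>", "num_workers": 8},
    )
    resp.raise_for_status()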

4 REPLIES

karthik_p
Esteemed Contributor

@SS2 increasing the cluster size every time may not be a good solution; the right fix depends on the scenario:

  1. Sometimes we may need to tweak the code.
  2. Sometimes we may need to add memory parameters (see the sketch after this list).
  3. Ganglia metrics can give more information about where memory is being used.
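A minimal sketch of point 2, assuming a self-managed PySpark session (on Databricks these settings would normally go in the cluster's Spark config instead, and the values here are only illustrative):

    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder
        .appName("oom-tuning-sketch")
        .config("spark.executor.memory", "8g")          # heap per executor
        .config("spark.executor.memoryOverhead", "2g")  # off-heap overhead per executor
        .config("spark.driver.memory", "4g")            # driver heap
        .config("spark.memory.fraction", "0.6")         # share of heap for execution/storage
        .getOrCreate()
    )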

NhatHoang
Valued Contributor II

Hi guys,

I agree; it is better to improve your code rather than increase the cluster size. For example, you can configure the number of partitions.
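A short sketch of what that might look like (the DataFrame, column name, partition counts, and output path are placeholders):

    # Raise or lower the shuffle partition count to match the data volume.
    spark.conf.set("spark.sql.shuffle.partitions", "400")

    # Repartition a large DataFrame before a wide operation to spread memory load.
    df = df.repartition(400, "customer_id")

    # Coalesce to fewer partitions before writing small output.
    df.coalesce(16).write.mode("overwrite").parquet("/tmp/output")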

Shalabh007
Honored Contributor

Jumping straight to increasing the cluster size is not advisable. I found this nicely written blog on the potential reasons for OOM errors in Spark and some initial steps to resolve them:

https://medium.com/swlh/spark-oom-error-closeup-462c7a01709d

DK03
Contributor

Adding some more points to @karthik_p's answer.

  1. Use the Kryo serializer instead of the Java serializer (sketched below).
  2. Use an optimized garbage collector such as G1GC.
  3. Partition wisely on a well-chosen field.
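A minimal sketch of points 1 and 2, assuming session-level configuration (driver-side JVM options only take effect if set before the driver starts, e.g. via spark-submit or the cluster's Spark config, so only the executor option is shown; values are illustrative):

    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder
        .appName("kryo-g1gc-sketch")
        # 1. Kryo serializer instead of the default Java serializer
        .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
        .config("spark.kryoserializer.buffer.max", "512m")
        # 2. G1 garbage collector on the executors
        .config("spark.executor.extraJavaOptions", "-XX:+UseG1GC")
        .getOrCreate()
    )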