Hi @dbuserng,
The memory usage in your Spark application can exceed the spark.executor.memory
setting of 7GB for several reasons:
• Off-Heap Memory Usage: Spark allows for off-heap memory allocation, which is not managed by the JVM garbage collector. This is controlled by the settings spark.memory.offHeap.enabled and spark.memory.offHeap.size. If off-heap memory is enabled, the total memory usage can exceed the JVM heap size allocated by spark.executor.memory.
Check whether off-heap memory is enabled and adjust spark.memory.offHeap.enabled and spark.memory.offHeap.size accordingly, as in the sketch below.
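A minimal PySpark sketch of one way to read and set these options, assuming you build the SparkSession yourself (the 2g size is purely illustrative, not a recommendation; on Databricks these are usually set in the cluster's Spark config instead):

```python
from pyspark.sql import SparkSession

# Illustrative values only: "2g" of off-heap memory is an assumption for this
# example. Off-heap memory is allocated on top of spark.executor.memory, so
# total executor memory usage can exceed the 7GB heap setting.
spark = (
    SparkSession.builder
    .appName("offheap-example")
    .config("spark.memory.offHeap.enabled", "true")
    .config("spark.memory.offHeap.size", "2g")
    .getOrCreate()
)

# Check what the running session actually uses; the second argument is the
# default returned when the key was never set.
print(spark.conf.get("spark.memory.offHeap.enabled", "false"))
print(spark.conf.get("spark.memory.offHeap.size", "0"))
```

Note that the spark.memory.offHeap.* settings are static configs, so they must be in place before the application (or cluster) starts; setting them on an already-running session has no effect.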