JVM Heap Memory Graph - more memory used than available
01-23-2025 05:12 AM
I'm analyzing the memory usage of my Spark application and I see something strange when checking the JVM Heap Memory Graph (see screenshot below). Each line on the graph represents one executor.
Why does the memory usage sometimes reach over 10GB when my spark.executor.memory is set to 7GB? How is it possible that more memory is used than is actually available?
Thanks for your help!
01-30-2025 12:03 AM
Hi @dbuserng,
The memory usage in your Spark application can exceed the spark.executor.memory setting of 7GB for several reasons:
• Off-Heap Memory Usage: Spark allows for off-heap memory allocation, which is not managed by the JVM garbage collector. This is controlled by the settings spark.memory.offHeap.enabled and spark.memory.offHeap.size. If off-heap memory is enabled, the total memory usage can exceed the JVM heap size allocated by spark.executor.memory.
Check if off-heap memory is enabled and adjust spark.memory.offHeap.enabled and spark.memory.offHeap.size accordingly.
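As a minimal sketch of how you might check these settings, the PySpark snippet below reads the off-heap configuration back from a running session. The specific sizes ("7g" heap, "2g" off-heap) and the app name are illustrative placeholders, not recommendations; on Databricks these options normally belong in the cluster's Spark config rather than in notebook code, since they must be in place before the executors start.

```python
from pyspark.sql import SparkSession

# Sketch only: if you control session creation, off-heap memory is enabled
# at build time. On Databricks, set the equivalent keys in the cluster's
# Spark config instead; builder.config() has no effect on an already
# running session. The sizes below are placeholders, not tuned values.
spark = (
    SparkSession.builder
    .appName("offheap-check")                      # hypothetical app name
    .config("spark.executor.memory", "7g")         # JVM heap per executor
    .config("spark.memory.offHeap.enabled", "true")
    .config("spark.memory.offHeap.size", "2g")     # extra memory outside the heap
    .getOrCreate()
)

# Read back what the running session actually sees; the second argument is
# the default returned when the key was never set.
print(spark.conf.get("spark.memory.offHeap.enabled", "false"))
print(spark.conf.get("spark.memory.offHeap.size", "0"))
```

If spark.memory.offHeap.enabled comes back as true, the off-heap size is allocated in addition to the 7GB heap, which would explain executors reporting total memory usage above spark.executor.memory.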

