Relevance of off heap memory and usage
02-18-2025 07:57 PM
I was referring to this doc: https://kb.databricks.com/clusters/spark-executor-memory.
In general, total off-heap memory = spark.executor.memoryOverhead + spark.memory.offHeap.size. Off-heap mode is controlled by the property spark.memory.offHeap.enabled.
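As a sketch of how those settings combine, here is a minimal calculation using example values (the property names are real Spark configuration keys, but the sizes below are hypothetical, not recommendations):

```python
# Hypothetical executor config (example values only).
conf = {
    "spark.executor.memory": "8g",           # on-heap JVM memory
    "spark.executor.memoryOverhead": "1g",   # off-heap overhead (JVM internals, native buffers, etc.)
    "spark.memory.offHeap.enabled": "true",  # enables Spark's managed off-heap allocation
    "spark.memory.offHeap.size": "2g",       # off-heap pool used by Spark's memory manager
}

def to_gb(size: str) -> int:
    """Parse a simple 'Ng' size string into an integer number of gigabytes."""
    return int(size.rstrip("g"))

# Total off-heap memory the container needs beyond the JVM heap:
off_heap_total = to_gb(conf["spark.executor.memoryOverhead"])
if conf["spark.memory.offHeap.enabled"] == "true":
    off_heap_total += to_gb(conf["spark.memory.offHeap.size"])

print(f"Total off-heap memory: {off_heap_total}g")  # Total off-heap memory: 3g
```

Note that if spark.memory.offHeap.enabled were false in this sketch, only the memoryOverhead term would remain in the total.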
Could you please clarify:
- What is the difference between spark.executor.memoryOverhead and spark.memory.offHeap.size? When should one be used over the other?
- In what use cases/scenarios/operations does Spark need off-heap memory?
- When spark.memory.offHeap.enabled is set to false, does it disable only spark.memory.offHeap.size, or both spark.executor.memoryOverhead and spark.memory.offHeap.size?