java.lang.OutOfMemoryError: GC overhead limit exceeded
I get `java.lang.OutOfMemoryError: GC overhead limit exceeded` when trying a count action on a file. The file is a 217 GB CSV. I'm using 10 r3.8xlarge (Ubuntu) machines, CDH 5.3.6 and Spark 1.2.0. Configuration: spark.app.id:local-1443956477103 s...
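Roughly, the job looks like this (a minimal sketch only; the exact path, input format, and whether the RDD is cached are assumptions, since the original code is not shown):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Sketch of the described job on the Spark 1.2 RDD API.
object CountCsv {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("count-csv")
    val sc = new SparkContext(conf)

    // The 217 GB CSV read as plain text lines; the path below is hypothetical.
    val lines = sc.textFile("hdfs:///data/input.csv")
    lines.cache() // caching is assumed here, based on the reply below

    println(s"Row count: ${lines.count()}")
    sc.stop()
  }
}
```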
Latest Reply
The following property looks quite high, which consumes a lot of memory on your executors when you cache the dataset: "spark.storage.memoryFraction:0.9". This could likely be solved by changing that configuration. Take a look at the upstream...
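For example, a minimal sketch of lowering that fraction when building the SparkConf (Spark 1.x setting; the 0.5 value is only an illustration, not a recommendation from the original reply):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Reduce the share of executor heap reserved for cached RDD storage
// so more heap is left for task execution (Spark 1.x default is 0.6).
val conf = new SparkConf()
  .setAppName("count-csv")
  .set("spark.storage.memoryFraction", "0.5") // illustrative value

val sc = new SparkContext(conf)
```

The same setting can also be passed without code changes, e.g. via `spark-submit --conf spark.storage.memoryFraction=0.5` or in `spark-defaults.conf`.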