11-22-2021 12:58 AM
I am trying to read a 16 MB Excel file and was getting a "GC overhead limit exceeded" error. To resolve it, I tried to increase the executor memory with
spark.conf.set("spark.executor.memory", "8g")
but I got the following stack trace:
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Exception in thread "main" org.apache.spark.sql.AnalysisException: Cannot modify the value of a Spark config: spark.executor.memory;
at org.apache.spark.sql.RuntimeConfig.requireNonStaticConf(RuntimeConfig.scala:158)
at org.apache.spark.sql.RuntimeConfig.set(RuntimeConfig.scala:42)
at com.sundogsoftware.spark.spaceTrim.trimmer$.delayedEndpoint$com$sundogsoftware$spark$spaceTrim$trimmer$1(trimmer.scala:29)
at com.sundogsoftware.spark.spaceTrim.trimmer$delayedInit$body.apply(trimmer.scala:9)
at scala.Function0.apply$mcV$sp(Function0.scala:39)
at scala.Function0.apply$mcV$sp$(Function0.scala:39)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:17)
at scala.App.$anonfun$main$1$adapted(App.scala:80)
at scala.collection.immutable.List.foreach(List.scala:431)
at scala.App.main(App.scala:80)
at scala.App.main$(App.scala:78)
at com.sundogsoftware.spark.spaceTrim.trimmer$.main(trimmer.scala:9)
at com.sundogsoftware.spark.spaceTrim.trimmer.main(trimmer.scala)
My code:
val spark = SparkSession
  .builder
  .appName("schemaTest")
  .master("local[*]")
  .getOrCreate()

spark.conf.set("spark.executor.memory", "8g")

val df = spark.read
  .format("com.crealytics.spark.excel")
  .option("header", "true")
  .option("inferSchema", "false")
  .option("treatEmptyValuesAsNulls", "false")
  .option("addColorColumns", "false")
  .load("data/12file.xlsx")
Labels: Exception, Executor Memory, Spark config
Accepted Solutions
11-22-2021 01:24 AM
On the cluster configuration page, expand the advanced options. There you will find the Spark tab, where you can set these values in the "Spark config" field.
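For reference, that field takes one setting per line as space-separated key/value pairs, e.g. (using the value from the question):

spark.executor.memory 8g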
11-22-2021 01:16 AM
Hi @sarvesh singh, please try setting the value in the cluster's Spark config tab. It should help.
11-22-2021 01:50 AM
Thank you for replying, but for this project I am using IntelliJ and working locally. Is there some way to do the same with the Spark session or context?
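For a local IntelliJ run, a minimal sketch of one option (not taken from this thread's replies; it reuses the 8g value from the question): pass the setting to the SparkSession builder before getOrCreate(), since spark.executor.memory is a static config that cannot be changed on a running session. Note that with master("local[*]") everything runs inside the driver JVM, so for the GC overhead error the effective limit is the JVM heap, typically set via -Xmx in the IntelliJ run configuration's VM options.

import org.apache.spark.sql.SparkSession

// Static confs must be supplied before the session is created,
// otherwise spark.conf.set(...) throws the AnalysisException above.
val spark = SparkSession
  .builder
  .appName("schemaTest")
  .master("local[*]")
  .config("spark.executor.memory", "8g") // set here instead of spark.conf.set(...)
  .getOrCreate()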