<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: java.lang.OutOfMemoryError: GC overhead limit exceeded. [ solved ] in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/java-lang-outofmemoryerror-gc-overhead-limit-exceeded-solved/m-p/34603#M25341</link>
    <description>&lt;P&gt;You don't do anything else besides the spark.read for the Excel file, right?&lt;/P&gt;</description>
    <pubDate>Tue, 23 Nov 2021 09:44:30 GMT</pubDate>
    <dc:creator>-werners-</dc:creator>
    <dc:date>2021-11-23T09:44:30Z</dc:date>
    <item>
      <title>java.lang.OutOfMemoryError: GC overhead limit exceeded. [ solved ]</title>
      <link>https://community.databricks.com/t5/data-engineering/java-lang-outofmemoryerror-gc-overhead-limit-exceeded-solved/m-p/34596#M25334</link>
      <description>&lt;P&gt;Solution: I didn't need to add any executor or driver memory. All I had to do in my case was add this option: option("maxRowsInMemory", 1000).&lt;/P&gt;&lt;P&gt;Before, I couldn't even read a 9 MB file; now I can read a 50 MB file without any error.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;val df = spark.read&lt;/P&gt;&lt;P&gt; .format("com.crealytics.spark.excel")&lt;/P&gt;&lt;P&gt; .option("maxRowsInMemory", 1000)&lt;/P&gt;&lt;P&gt; .option("header", "true")&lt;/P&gt;&lt;P&gt; .load("data/12file.xlsx")&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Original question: I am trying to read an 8 MB Excel file,&lt;/P&gt;&lt;P&gt;and I am getting this error.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;I use IntelliJ with Spark 2.4.4,&lt;/P&gt;&lt;P&gt;Scala 2.12.12,&lt;/P&gt;&lt;P&gt;and JDK 1.8.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;This is my code:&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;val conf = new SparkConf()&lt;/P&gt;&lt;P&gt; .set("spark.driver.memory","4g")&lt;/P&gt;&lt;P&gt; .set("spark.executor.memory", "6g")&lt;/P&gt;&lt;P&gt;// .set("spark.executor.cores", "2")&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;val spark = SparkSession&lt;/P&gt;&lt;P&gt; .builder&lt;/P&gt;&lt;P&gt; .appName("trimTest")&lt;/P&gt;&lt;P&gt; .master("local[*]")&lt;/P&gt;&lt;P&gt; .config(conf)&lt;/P&gt;&lt;P&gt; .getOrCreate()&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;val df = spark.read&lt;/P&gt;&lt;P&gt; .format("com.crealytics.spark.excel")&lt;/P&gt;&lt;P&gt; .option("header", "true")&lt;/P&gt;&lt;P&gt; .load("data/12file.xlsx")&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Now, these are my Spark UI screenshots.&lt;/P&gt;&lt;P&gt;Can you tell me what the main issue is and how I can increase the job executor memory?&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper" 
image-alt="edit spark ui 2"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/2304i2A7BE1AA8BD63C4C/image-size/large?v=v2&amp;amp;px=999" role="button" title="edit spark ui 2" alt="edit spark ui 2" /&gt;&lt;/span&gt;&lt;span class="lia-inline-image-display-wrapper" image-alt="edit spark ui 1"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/2296i4D11185811B04A15/image-size/large?v=v2&amp;amp;px=999" role="button" title="edit spark ui 1" alt="edit spark ui 1" /&gt;&lt;/span&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;stack :-&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;java.lang.OutOfMemoryError: GC overhead limit exceeded&lt;/P&gt;&lt;P&gt;	at java.lang.Class.newReflectionData(Class.java:2511)&lt;/P&gt;&lt;P&gt;	at java.lang.Class.reflectionData(Class.java:2503)&lt;/P&gt;&lt;P&gt;	at java.lang.Class.privateGetDeclaredConstructors(Class.java:2660)&lt;/P&gt;&lt;P&gt;	at java.lang.Class.getConstructor0(Class.java:3075)&lt;/P&gt;&lt;P&gt;	at java.lang.Class.newInstance(Class.java:412)&lt;/P&gt;&lt;P&gt;	at sun.reflect.MethodAccessorGenerator$1.run(MethodAccessorGenerator.java:403)&lt;/P&gt;&lt;P&gt;	at sun.reflect.MethodAccessorGenerator$1.run(MethodAccessorGenerator.java:394)&lt;/P&gt;&lt;P&gt;	at java.security.AccessController.doPrivileged(Native Method)&lt;/P&gt;&lt;P&gt;	at sun.reflect.MethodAccessorGenerator.generate(MethodAccessorGenerator.java:393)&lt;/P&gt;&lt;P&gt;	at sun.reflect.MethodAccessorGenerator.generateMethod(MethodAccessorGenerator.java:75)&lt;/P&gt;&lt;P&gt;	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:53)&lt;/P&gt;&lt;P&gt;	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)&lt;/P&gt;&lt;P&gt;	at java.lang.reflect.Method.invoke(Method.java:498)&lt;/P&gt;&lt;P&gt;	at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:276)&lt;/P&gt;&lt;P&gt;	at 
com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:193)&lt;/P&gt;&lt;P&gt;	at com.sun.jmx.mbeanserver.ConvertingMethod.invokeWithOpenReturn(ConvertingMethod.java:175)&lt;/P&gt;&lt;P&gt;	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:117)&lt;/P&gt;&lt;P&gt;	at com.sun.jmx.mbeanserver.MXBeanIntrospector.invokeM2(MXBeanIntrospector.java:54)&lt;/P&gt;&lt;P&gt;	at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)&lt;/P&gt;&lt;P&gt;	at com.sun.jmx.mbeanserver.PerInterface.getAttribute(PerInterface.java:83)&lt;/P&gt;&lt;P&gt;	at com.sun.jmx.mbeanserver.MBeanSupport.getAttribute(MBeanSupport.java:206)&lt;/P&gt;&lt;P&gt;	at javax.management.StandardMBean.getAttribute(StandardMBean.java:372)&lt;/P&gt;&lt;P&gt;	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getAttribute(DefaultMBeanServerInterceptor.java:647)&lt;/P&gt;&lt;P&gt;	at com.sun.jmx.mbeanserver.JmxMBeanServer.getAttribute(JmxMBeanServer.java:678)&lt;/P&gt;&lt;P&gt;	at com.sun.jmx.mbeanserver.MXBeanProxy$GetHandler.invoke(MXBeanProxy.java:122)&lt;/P&gt;&lt;P&gt;	at com.sun.jmx.mbeanserver.MXBeanProxy.invoke(MXBeanProxy.java:167)&lt;/P&gt;&lt;P&gt;	at javax.management.MBeanServerInvocationHandler.invoke(MBeanServerInvocationHandler.java:258)&lt;/P&gt;&lt;P&gt;	at com.sun.proxy.$Proxy8.getMemoryUsed(Unknown Source)&lt;/P&gt;&lt;P&gt;	at org.apache.spark.metrics.MBeanExecutorMetricType.getMetricValue(ExecutorMetricType.scala:67)&lt;/P&gt;&lt;P&gt;	at org.apache.spark.metrics.SingleValueExecutorMetricType.getMetricValues(ExecutorMetricType.scala:46)&lt;/P&gt;&lt;P&gt;	at org.apache.spark.metrics.SingleValueExecutorMetricType.getMetricValues$(ExecutorMetricType.scala:44)&lt;/P&gt;&lt;P&gt;	at org.apache.spark.metrics.MBeanExecutorMetricType.getMetricValues(ExecutorMetricType.scala:60)&lt;/P&gt;</description>
      <pubDate>Tue, 23 Nov 2021 05:51:42 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/java-lang-outofmemoryerror-gc-overhead-limit-exceeded-solved/m-p/34596#M25334</guid>
      <dc:creator>sarvesh</dc:creator>
      <dc:date>2021-11-23T05:51:42Z</dc:date>
    </item>
    <item>
      <title>Re: java.lang.OutOfMemoryError: GC overhead limit exceeded. [ solved ]</title>
      <link>https://community.databricks.com/t5/data-engineering/java-lang-outofmemoryerror-gc-overhead-limit-exceeded-solved/m-p/34597#M25335</link>
      <description>&lt;P&gt;I doubt it is the 8 MB file.&lt;/P&gt;&lt;P&gt;What happens if you do not set any memory parameters at all (i.e., use the defaults)?&lt;/P&gt;</description>
      <pubDate>Tue, 23 Nov 2021 07:05:56 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/java-lang-outofmemoryerror-gc-overhead-limit-exceeded-solved/m-p/34597#M25335</guid>
      <dc:creator>-werners-</dc:creator>
      <dc:date>2021-11-23T07:05:56Z</dc:date>
    </item>
    <item>
      <title>Re: java.lang.OutOfMemoryError: GC overhead limit exceeded. [ solved ]</title>
      <link>https://community.databricks.com/t5/data-engineering/java-lang-outofmemoryerror-gc-overhead-limit-exceeded-solved/m-p/34598#M25336</link>
      <description>&lt;P&gt;It is an 8.5 MB xlsx file with 100k rows of data.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;I get the same GC overhead limit exceeded error without adding any parameters.&lt;/P&gt;</description>
      <pubDate>Tue, 23 Nov 2021 07:40:04 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/java-lang-outofmemoryerror-gc-overhead-limit-exceeded-solved/m-p/34598#M25336</guid>
      <dc:creator>sarvesh</dc:creator>
      <dc:date>2021-11-23T07:40:04Z</dc:date>
    </item>
    <item>
      <title>Re: java.lang.OutOfMemoryError: GC overhead limit exceeded. [ solved ]</title>
      <link>https://community.databricks.com/t5/data-engineering/java-lang-outofmemoryerror-gc-overhead-limit-exceeded-solved/m-p/34599#M25337</link>
      <description>&lt;P&gt;My guess is indeed a config issue, as in your Spark script you don't seem to perform any action (Spark is lazily evaluated).&lt;/P&gt;&lt;P&gt;As you run Spark locally, chances are the JVM cannot allocate enough RAM to run successfully.&lt;/P&gt;&lt;P&gt;Can you check the docs:&lt;/P&gt;&lt;P&gt;&lt;A href="https://spark.apache.org/docs/2.4.4/tuning.html#garbage-collection-tuning" alt="https://spark.apache.org/docs/2.4.4/tuning.html#garbage-collection-tuning" target="_blank"&gt;https://spark.apache.org/docs/2.4.4/tuning.html#garbage-collection-tuning&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 23 Nov 2021 07:56:21 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/java-lang-outofmemoryerror-gc-overhead-limit-exceeded-solved/m-p/34599#M25337</guid>
      <dc:creator>-werners-</dc:creator>
      <dc:date>2021-11-23T07:56:21Z</dc:date>
    </item>
    <item>
      <title>Re: java.lang.OutOfMemoryError: GC overhead limit exceeded. [ solved ]</title>
      <link>https://community.databricks.com/t5/data-engineering/java-lang-outofmemoryerror-gc-overhead-limit-exceeded-solved/m-p/34600#M25338</link>
      <description>&lt;P&gt;Yes, I just went through it. From what I understood, I need to increase the heap space, but increasing it at run time with IntelliJ is not working.&lt;/P&gt;</description>
      <pubDate>Tue, 23 Nov 2021 08:44:25 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/java-lang-outofmemoryerror-gc-overhead-limit-exceeded-solved/m-p/34600#M25338</guid>
      <dc:creator>sarvesh</dc:creator>
      <dc:date>2021-11-23T08:44:25Z</dc:date>
    </item>
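A likely reason the run-time heap increase from IntelliJ has no effect: in local mode the driver runs inside the already-started JVM, so `spark.driver.memory` set programmatically via `SparkConf` is too late to change the heap. A minimal sketch of this, assuming a local-mode run from an IDE (the app name and paths are illustrative):

```scala
// In local mode the driver heap IS the JVM heap, fixed at launch.
// Setting spark.driver.memory from inside the program has no effect,
// because the JVM hosting the driver has already started. Set the heap
// when the JVM launches instead, e.g. in IntelliJ under
//   Run > Edit Configurations > VM options:  -Xmx4g
// or, with spark-submit, before the driver starts:
//   spark-submit --driver-memory 4g --class trimTest target/app.jar
import org.apache.spark.SparkConf

val conf = new SparkConf()
  .setAppName("trimTest")
  .setMaster("local[*]")
  // .set("spark.driver.memory", "4g") // ignored in local mode once the JVM is up
```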
    <item>
      <title>Re: java.lang.OutOfMemoryError: GC overhead limit exceeded. [ solved ]</title>
      <link>https://community.databricks.com/t5/data-engineering/java-lang-outofmemoryerror-gc-overhead-limit-exceeded-solved/m-p/34601#M25339</link>
      <description>&lt;P&gt;Can you try with local[2] instead of local[*]?&lt;/P&gt;&lt;P&gt;Also beef up the driver memory to around 90% of your RAM.&lt;/P&gt;&lt;P&gt;As you run in local mode, the driver and the executors all run in the same process, which is controlled by the driver memory.&lt;/P&gt;&lt;P&gt;So you can skip the executor params.&lt;/P&gt;</description>
      <pubDate>Tue, 23 Nov 2021 09:02:37 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/java-lang-outofmemoryerror-gc-overhead-limit-exceeded-solved/m-p/34601#M25339</guid>
      <dc:creator>-werners-</dc:creator>
      <dc:date>2021-11-23T09:02:37Z</dc:date>
    </item>
    <item>
      <title>Re: java.lang.OutOfMemoryError: GC overhead limit exceeded. [ solved ]</title>
      <link>https://community.databricks.com/t5/data-engineering/java-lang-outofmemoryerror-gc-overhead-limit-exceeded-solved/m-p/34602#M25340</link>
      <description>&lt;P&gt;I did; it is still the same. There is something else that I am missing here, and my memory consumption was 7 GB out of the 8 GB available right now.&lt;/P&gt;</description>
      <pubDate>Tue, 23 Nov 2021 09:38:14 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/java-lang-outofmemoryerror-gc-overhead-limit-exceeded-solved/m-p/34602#M25340</guid>
      <dc:creator>sarvesh</dc:creator>
      <dc:date>2021-11-23T09:38:14Z</dc:date>
    </item>
    <item>
      <title>Re: java.lang.OutOfMemoryError: GC overhead limit exceeded. [ solved ]</title>
      <link>https://community.databricks.com/t5/data-engineering/java-lang-outofmemoryerror-gc-overhead-limit-exceeded-solved/m-p/34603#M25341</link>
      <description>&lt;P&gt;You don't do anything else besides the spark.read for the Excel file, right?&lt;/P&gt;</description>
      <pubDate>Tue, 23 Nov 2021 09:44:30 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/java-lang-outofmemoryerror-gc-overhead-limit-exceeded-solved/m-p/34603#M25341</guid>
      <dc:creator>-werners-</dc:creator>
      <dc:date>2021-11-23T09:44:30Z</dc:date>
    </item>
    <item>
      <title>Re: java.lang.OutOfMemoryError: GC overhead limit exceeded. [ solved ]</title>
      <link>https://community.databricks.com/t5/data-engineering/java-lang-outofmemoryerror-gc-overhead-limit-exceeded-solved/m-p/34604#M25342</link>
      <description>&lt;P&gt;I am doing df.show(),&lt;/P&gt;&lt;P&gt;nothing else.&lt;/P&gt;</description>
      <pubDate>Tue, 23 Nov 2021 10:15:43 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/java-lang-outofmemoryerror-gc-overhead-limit-exceeded-solved/m-p/34604#M25342</guid>
      <dc:creator>sarvesh</dc:creator>
      <dc:date>2021-11-23T10:15:43Z</dc:date>
    </item>
    <item>
      <title>Re: java.lang.OutOfMemoryError: GC overhead limit exceeded. [ solved ]</title>
      <link>https://community.databricks.com/t5/data-engineering/java-lang-outofmemoryerror-gc-overhead-limit-exceeded-solved/m-p/34605#M25343</link>
      <description>&lt;P&gt;Weird; when I run Spark locally, I just install it, do not configure any executors, and it just works.&lt;/P&gt;&lt;P&gt;Did you define any executors by any chance?&lt;/P&gt;</description>
      <pubDate>Tue, 23 Nov 2021 10:18:58 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/java-lang-outofmemoryerror-gc-overhead-limit-exceeded-solved/m-p/34605#M25343</guid>
      <dc:creator>-werners-</dc:creator>
      <dc:date>2021-11-23T10:18:58Z</dc:date>
    </item>
    <item>
      <title>Re: java.lang.OutOfMemoryError: GC overhead limit exceeded. [ solved ]</title>
      <link>https://community.databricks.com/t5/data-engineering/java-lang-outofmemoryerror-gc-overhead-limit-exceeded-solved/m-p/34606#M25344</link>
      <description>&lt;P&gt;No, I didn't; I have also just installed it. I think this is an issue with my machine, or something I have not set up right.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 23 Nov 2021 10:49:26 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/java-lang-outofmemoryerror-gc-overhead-limit-exceeded-solved/m-p/34606#M25344</guid>
      <dc:creator>sarvesh</dc:creator>
      <dc:date>2021-11-23T10:49:26Z</dc:date>
    </item>
    <item>
      <title>Re: java.lang.OutOfMemoryError: GC overhead limit exceeded. [ solved ]</title>
      <link>https://community.databricks.com/t5/data-engineering/java-lang-outofmemoryerror-gc-overhead-limit-exceeded-solved/m-p/34607#M25345</link>
      <description>&lt;P&gt;Can you try without:&lt;/P&gt;&lt;P&gt; .set("spark.driver.memory","4g")&lt;/P&gt;&lt;P&gt; .set("spark.executor.memory", "6g")&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;It clearly shows that there is not 4 GB free for the driver and 6 GB free for the executor (you can also share your hardware/cluster details).&lt;/P&gt;&lt;P&gt;You also usually cannot allocate 100% of memory to Spark, as there are other processes running.&lt;/P&gt;&lt;P&gt;Automatic settings are recommended.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 23 Nov 2021 11:07:12 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/java-lang-outofmemoryerror-gc-overhead-limit-exceeded-solved/m-p/34607#M25345</guid>
      <dc:creator>Hubert-Dudek</dc:creator>
      <dc:date>2021-11-23T11:07:12Z</dc:date>
    </item>
    <item>
      <title>Re: java.lang.OutOfMemoryError: GC overhead limit exceeded. [ solved ]</title>
      <link>https://community.databricks.com/t5/data-engineering/java-lang-outofmemoryerror-gc-overhead-limit-exceeded-solved/m-p/34608#M25346</link>
      <description>&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper" image-alt="Screenshot from 2021-11-23 17-30-32"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/2298iEB69220545CB074D/image-size/large?v=v2&amp;amp;px=999" role="button" title="Screenshot from 2021-11-23 17-30-32" alt="Screenshot from 2021-11-23 17-30-32" /&gt;&lt;/span&gt;I tried to read it without these configs and got the same error (GC overhead limit), and I am running it locally. These are my specifications.&lt;/P&gt;</description>
      <pubDate>Tue, 23 Nov 2021 12:00:18 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/java-lang-outofmemoryerror-gc-overhead-limit-exceeded-solved/m-p/34608#M25346</guid>
      <dc:creator>sarvesh</dc:creator>
      <dc:date>2021-11-23T12:00:18Z</dc:date>
    </item>
    <item>
      <title>Re: java.lang.OutOfMemoryError: GC overhead limit exceeded. [ solved ]</title>
      <link>https://community.databricks.com/t5/data-engineering/java-lang-outofmemoryerror-gc-overhead-limit-exceeded-solved/m-p/34609#M25347</link>
      <description>&lt;P&gt;It seems that you have only 8 GB of RAM (the system probably needs at least 4-6 GB), but you allocated 10 GB for Spark (4 GB driver + 6 GB executor).&lt;/P&gt;&lt;P&gt;In my opinion, you can allocate at most 2 GB altogether if your RAM is 8 GB; maybe even 1 GB, as there can also be spikes in system processes.&lt;/P&gt;&lt;P&gt;It would be easier with Docker, as you then allocate a fixed amount of RAM to the container and Spark can consume an exact amount.&lt;/P&gt;</description>
      <pubDate>Tue, 23 Nov 2021 12:50:54 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/java-lang-outofmemoryerror-gc-overhead-limit-exceeded-solved/m-p/34609#M25347</guid>
      <dc:creator>Hubert-Dudek</dc:creator>
      <dc:date>2021-11-23T12:50:54Z</dc:date>
    </item>
    <item>
      <title>Re: java.lang.OutOfMemoryError: GC overhead limit exceeded. [ solved ]</title>
      <link>https://community.databricks.com/t5/data-engineering/java-lang-outofmemoryerror-gc-overhead-limit-exceeded-solved/m-p/34610#M25348</link>
      <description>&lt;P&gt;Yes, that will be it. 8 GB can run Spark, but I'd go for no more than 3 GB, or 2 GB to be on the safe side.&lt;/P&gt;&lt;P&gt;It looks like an Ubuntu install, so it is not as resource-hungry as Windows, but 8 GB is not much.&lt;/P&gt;&lt;P&gt;For tinkering around, I always go for Docker (or a VM).&lt;/P&gt;</description>
      <pubDate>Tue, 23 Nov 2021 13:28:10 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/java-lang-outofmemoryerror-gc-overhead-limit-exceeded-solved/m-p/34610#M25348</guid>
      <dc:creator>-werners-</dc:creator>
      <dc:date>2021-11-23T13:28:10Z</dc:date>
    </item>
    <item>
      <title>Re: java.lang.OutOfMemoryError: GC overhead limit exceeded. [ solved ]</title>
      <link>https://community.databricks.com/t5/data-engineering/java-lang-outofmemoryerror-gc-overhead-limit-exceeded-solved/m-p/34611#M25349</link>
      <description>&lt;P&gt;Thank you, I just found a solution, and I have mentioned it in my question too. While reading my file, all I had to do was add this:&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;option("maxRowsInMemory", 1000)&lt;/P&gt;&lt;P&gt;&lt;/P&gt;</description>
      <pubDate>Thu, 25 Nov 2021 11:12:58 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/java-lang-outofmemoryerror-gc-overhead-limit-exceeded-solved/m-p/34611#M25349</guid>
      <dc:creator>sarvesh</dc:creator>
      <dc:date>2021-11-25T11:12:58Z</dc:date>
    </item>
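The fix reported above can be sketched end to end as follows. This is a hedged reconstruction, assuming the com.crealytics:spark-excel data source is on the classpath and that data/12file.xlsx exists; with maxRowsInMemory set, spark-excel reads the workbook through a streaming reader, so only a window of rows is held on the heap instead of the whole parsed file.

```scala
import org.apache.spark.sql.SparkSession

// Minimal local-mode sketch of the reported fix. The file path and app
// name come from the thread; no explicit driver/executor memory is set,
// matching the poster's final working configuration.
val spark = SparkSession.builder
  .appName("trimTest")
  .master("local[2]")
  .getOrCreate()

val df = spark.read
  .format("com.crealytics.spark.excel")
  .option("header", "true")
  .option("maxRowsInMemory", 1000) // stream rows instead of materializing the sheet
  .load("data/12file.xlsx")

df.show()
```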
    <item>
      <title>Re: java.lang.OutOfMemoryError: GC overhead limit exceeded. [ solved ]</title>
      <link>https://community.databricks.com/t5/data-engineering/java-lang-outofmemoryerror-gc-overhead-limit-exceeded-solved/m-p/34612#M25350</link>
      <description>&lt;P&gt;Found a way.&lt;/P&gt;</description>
      <pubDate>Thu, 25 Nov 2021 11:13:11 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/java-lang-outofmemoryerror-gc-overhead-limit-exceeded-solved/m-p/34612#M25350</guid>
      <dc:creator>sarvesh</dc:creator>
      <dc:date>2021-11-25T11:13:11Z</dc:date>
    </item>
    <item>
      <title>Re: java.lang.OutOfMemoryError: GC overhead limit exceeded. [ solved ]</title>
      <link>https://community.databricks.com/t5/data-engineering/java-lang-outofmemoryerror-gc-overhead-limit-exceeded-solved/m-p/34613#M25351</link>
      <description>&lt;P&gt;Hi @sarvesh singh&amp;nbsp;,&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Could you mark the "best" solution to your question, please? It will help in case other community members have the same issue in the future.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Thank you&lt;/P&gt;</description>
      <pubDate>Mon, 29 Nov 2021 19:07:30 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/java-lang-outofmemoryerror-gc-overhead-limit-exceeded-solved/m-p/34613#M25351</guid>
      <dc:creator>jose_gonzalez</dc:creator>
      <dc:date>2021-11-29T19:07:30Z</dc:date>
    </item>
    <item>
      <title>Re: java.lang.OutOfMemoryError: GC overhead limit exceeded. [ solved ]</title>
      <link>https://community.databricks.com/t5/data-engineering/java-lang-outofmemoryerror-gc-overhead-limit-exceeded-solved/m-p/34614#M25352</link>
      <description>&lt;P&gt;done&lt;/P&gt;</description>
      <pubDate>Wed, 01 Dec 2021 13:12:05 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/java-lang-outofmemoryerror-gc-overhead-limit-exceeded-solved/m-p/34614#M25352</guid>
      <dc:creator>sarvesh</dc:creator>
      <dc:date>2021-12-01T13:12:05Z</dc:date>
    </item>
  </channel>
</rss>

