<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic How can I automatically capture the heap dump on the driver and executors in the event of an OOM error? in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/how-can-i-automatically-capture-the-heap-dump-on-the-driver-and/m-p/25781#M17987</link>
    <description>&lt;P&gt;If you have a job that repeatedly runs into out-of-memory (OOM) errors on either the driver or the executors, automatically capturing a heap dump on the OOM event will help you debug the memory issue and identify the cause of the error.&lt;/P&gt;&lt;P&gt;Spark config:&lt;/P&gt;&lt;PRE&gt;&lt;CODE&gt;spark.executor.extraJavaOptions -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/dbfs/cluster-logs/heap-dumps/

spark.driver.extraJavaOptions -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/dbfs/cluster-logs/heap-dumps/&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;Environment variables:&lt;/P&gt;&lt;PRE&gt;&lt;CODE&gt;DBFS_FUSE_VERSION=1&lt;/CODE&gt;&lt;/PRE&gt;</description>
    <pubDate>Mon, 07 Jun 2021 22:02:32 GMT</pubDate>
    <dc:creator>User16752245312</dc:creator>
    <dc:date>2021-06-07T22:02:32Z</dc:date>
    <item>
      <title>How can I automatically capture the heap dump on the driver and executors in the event of an OOM error?</title>
      <link>https://community.databricks.com/t5/data-engineering/how-can-i-automatically-capture-the-heap-dump-on-the-driver-and/m-p/25781#M17987</link>
      <description>&lt;P&gt;If you have a job that repeatedly runs into out-of-memory (OOM) errors on either the driver or the executors, automatically capturing a heap dump on the OOM event will help you debug the memory issue and identify the cause of the error.&lt;/P&gt;&lt;P&gt;Spark config:&lt;/P&gt;&lt;PRE&gt;&lt;CODE&gt;spark.executor.extraJavaOptions -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/dbfs/cluster-logs/heap-dumps/

spark.driver.extraJavaOptions -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/dbfs/cluster-logs/heap-dumps/&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;Environment variables:&lt;/P&gt;&lt;PRE&gt;&lt;CODE&gt;DBFS_FUSE_VERSION=1&lt;/CODE&gt;&lt;/PRE&gt;</description>
      <pubDate>Mon, 07 Jun 2021 22:02:32 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/how-can-i-automatically-capture-the-heap-dump-on-the-driver-and/m-p/25781#M17987</guid>
      <dc:creator>User16752245312</dc:creator>
      <dc:date>2021-06-07T22:02:32Z</dc:date>
    </item>
    <item>
      <title>Re: How can I automatically capture the heap dump on the driver and executors in the event of an OOM error?</title>
      <link>https://community.databricks.com/t5/data-engineering/how-can-i-automatically-capture-the-heap-dump-on-the-driver-and/m-p/25782#M17988</link>
      <description>&lt;P&gt;Is it necessary to use exactly that HeapDumpPath? I find I'm unable to get driver heap dumps with a different path but otherwise the same configuration. I'm using spark_version 10.4.x-cpu-ml-scala2.12.&lt;/P&gt;</description>
      <pubDate>Tue, 09 Aug 2022 22:16:03 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/how-can-i-automatically-capture-the-heap-dump-on-the-driver-and/m-p/25782#M17988</guid>
      <dc:creator>John_360</dc:creator>
      <dc:date>2022-08-09T22:16:03Z</dc:date>
    </item>
    <item>
      <title>Re: How can I automatically capture the heap dump on the driver and executors in the event of an OOM error?</title>
      <link>https://community.databricks.com/t5/data-engineering/how-can-i-automatically-capture-the-heap-dump-on-the-driver-and/m-p/25783#M17989</link>
      <description>&lt;P&gt;It doesn't have to be exactly as in the example; you can use any path. But make sure the path points to a directory that already exists. Also note that the path is on DBFS: we want a location that both the driver and the executors can write to.&lt;/P&gt;</description>
      <pubDate>Thu, 18 Aug 2022 21:07:43 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/how-can-i-automatically-capture-the-heap-dump-on-the-driver-and/m-p/25783#M17989</guid>
      <dc:creator>User16752245312</dc:creator>
      <dc:date>2022-08-18T21:07:43Z</dc:date>
    </item>
  </channel>
</rss>

