<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: The spark context has stopped and the driver is restarting. Your notebook will be automatically in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/the-spark-context-has-stopped-and-the-driver-is-restarting-your/m-p/64215#M32499</link>
    <description>&lt;P&gt;Could you share the code you have in your JAR file? How are you creating your Spark context in your JAR file?&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
    <pubDate>Wed, 20 Mar 2024 21:32:37 GMT</pubDate>
    <dc:creator>jose_gonzalez</dc:creator>
    <dc:date>2024-03-20T21:32:37Z</dc:date>
    <item>
      <title>The spark context has stopped and the driver is restarting. Your notebook will be automatically</title>
      <link>https://community.databricks.com/t5/data-engineering/the-spark-context-has-stopped-and-the-driver-is-restarting-your/m-p/62998#M32147</link>
      <description>&lt;P&gt;I am trying to execute a Scala JAR in a notebook. When I execute it explicitly, I am able to run the JAR like this:&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Sachin__1-1709881658170.png" style="width: 400px;"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/6564iAF72549F70E465F3/image-size/medium/is-moderation-mode/true?v=v2&amp;amp;px=400" role="button" title="Sachin__1-1709881658170.png" alt="Sachin__1-1709881658170.png" /&gt;&lt;/span&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Sachin__2-1709881677411.png" style="width: 400px;"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/6565iB89D329CAC2417BF/image-size/medium/is-moderation-mode/true?v=v2&amp;amp;px=400" role="button" title="Sachin__2-1709881677411.png" alt="Sachin__2-1709881677411.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;but when I run the notebook through a Databricks workflow, I get the &lt;STRONG&gt;error&lt;/STRONG&gt; below:&amp;nbsp;&lt;STRONG&gt;The spark context has stopped and the driver is restarting. Your notebook will be automatically reattached.&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;Steps I have taken so far:&amp;nbsp;&lt;/P&gt;&lt;P&gt;I tried increasing the Spark driver memory like this:&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Sachin__3-1709881874830.png" style="width: 400px;"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/6566iA3C9BF6D7E98861F/image-size/medium/is-moderation-mode/true?v=v2&amp;amp;px=400" role="button" title="Sachin__3-1709881874830.png" alt="Sachin__3-1709881874830.png" /&gt;&lt;/span&gt;&lt;/P&gt;</description>
      <pubDate>Fri, 08 Mar 2024 07:13:07 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/the-spark-context-has-stopped-and-the-driver-is-restarting-your/m-p/62998#M32147</guid>
      <dc:creator>Sachin_</dc:creator>
      <dc:date>2024-03-08T07:13:07Z</dc:date>
    </item>
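The failure described in the post above is most often caused by the JAR itself tearing down the notebook's Spark context rather than by driver memory. A minimal sketch of the safe pattern, assuming a hypothetical entry-point object (names are illustrative, not the poster's actual code):

```scala
import org.apache.spark.sql.SparkSession

// Hypothetical JAR entry point. On Databricks the driver JVM already
// holds a live SparkSession, so getOrCreate() attaches to it instead
// of building a second context; creating a fresh SparkContext, or
// calling spark.stop(), is what produces "The spark context has
// stopped and the driver is restarting".
object EtlMain {
  def run(spark: SparkSession): Long =
    spark.range(10).count()  // placeholder for the real ETL work

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().getOrCreate()
    run(spark)
    // No spark.stop() here: on a shared cluster that kills the driver.
  }
}
```

If the job still fails with this pattern, the cluster's driver logs usually show either the OOM or the explicit stop that ended the context.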
    <item>
      <title>Re: The spark context has stopped and the driver is restarting. Your notebook will be automatically</title>
      <link>https://community.databricks.com/t5/data-engineering/the-spark-context-has-stopped-and-the-driver-is-restarting-your/m-p/63951#M32412</link>
      <description>&lt;P&gt;Hello&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/78314"&gt;@Kani&lt;/a&gt;&amp;nbsp;! Thanks for your time. In our organisation, the standard practice is to trigger the JAR via py4j. I have now captured the logs I was not getting earlier; the error is below:&amp;nbsp;&lt;/P&gt;&lt;P&gt;'{"error_code": 1, "error_message": "py4j does not exist in the JVM -- org.apache.spark.SparkException: Trying to putInheritedProperty with no active spark context\n\tat org.apache.spark.credentials.CredentialContext$.$anonfun$putInheritedProperty$2(CredentialContext.scala:188)\n\tat scala.Option.getOrElse(Option.scala:189)\n\tat org.apache.spark.credentials.CredentialContext$.$anonfun$putInheritedProperty$1(CredentialContext.scala:188)\n\tat scala.Option.getOrElse(Option.scala:189)\n\tat org.apache.spark.credentials.CredentialContext$.putInheritedProperty(CredentialContext.scala:187)\n\tat com.databricks.backend.daemon.driver.SparkThreadLocalUtils$$anon$1.$anonfun$run$2(SparkThreadLocalUtils.scala:56)\n\tat com.databricks.backend.daemon.driver.SparkThreadLocalUtils$$anon$1.$anonfun$run$2$adapted(SparkThreadLocalUtils.scala:56)\n\tat scala.Option.foreach(Option.scala:407)\n\tat com.databricks.backend.daemon.driver.SparkThreadLocalUtils$$anon$1.run(SparkThreadLocalUtils.scala:56)\n\tat java.lang.Iterable.forEach(Iterable.java:75)\n\tat py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:194)\n\tat py4j.ClientServerConnection.run(ClientServerConnection.java:115)\n\tat java.lang.Thread.run(Thread.java:750)\n"}', -- NEW Internal data field&lt;BR /&gt;'1', -- NEW error code field&lt;BR /&gt;NULL -- NEW Write ID field&lt;BR /&gt;-- Forward compatibility padding&lt;BR /&gt;)&lt;BR /&gt;2024-03-15 10:29:49,643:offer_toolbox.retry:DEBUG:Last attempt for 'robust_execute'&lt;BR /&gt;2024-03-15 10:29:49,644:offer_metadata.v1.store:DEBUG:Insert event from Step v1&lt;BR
/&gt;2024-03-15 10:29:49,645:offer_metadata.sql.engine:DEBUG:End cursor self._connection_count=1 CALLED:keep_alive=False FIXED:self.keep_alive=True&lt;BR /&gt;2024-03-15 10:29:49,646:offer_metadata.v1.operational:INFO:[AAM_Demo][demo_action] END Step&lt;BR /&gt;2024-03-15 10:29:50,711:offer_companion.companion:WARNING:Intercept IPython error py4j does not exist in the JVM -- org.apache.spark.SparkException: Trying to putInheritedProperty with no active spark context&lt;BR /&gt;at org.apache.spark.credentials.CredentialContext$.$anonfun$putInheritedProperty$2(CredentialContext.scala:188)&lt;BR /&gt;at scala.Option.getOrElse(Option.scala:189)&lt;BR /&gt;at org.apache.spark.credentials.CredentialContext$.$anonfun$putInheritedProperty$1(CredentialContext.scala:188)&lt;BR /&gt;at scala.Option.getOrElse(Option.scala:189)&lt;BR /&gt;at org.apache.spark.credentials.CredentialContext$.putInheritedProperty(CredentialContext.scala:187)&lt;BR /&gt;at com.databricks.backend.daemon.driver.SparkThreadLocalUtils$$anon$1.$anonfun$run$2(SparkThreadLocalUtils.scala:56)&lt;BR /&gt;at com.databricks.backend.daemon.driver.SparkThreadLocalUtils$$anon$1.$anonfun$run$2$adapted(SparkThreadLocalUtils.scala:56)&lt;BR /&gt;at scala.Option.foreach(Option.scala:407)&lt;BR /&gt;at com.databricks.backend.daemon.driver.SparkThreadLocalUtils$$anon$1.run(SparkThreadLocalUtils.scala:56)&lt;BR /&gt;at java.lang.Iterable.forEach(Iterable.java:75)&lt;BR /&gt;at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:194)&lt;BR /&gt;at py4j.ClientServerConnection.run(ClientServerConnection.java:115)&lt;BR /&gt;at java.lang.Thread.run(Thread.java:750)&lt;/P&gt;&lt;P&gt;Any idea on how I can initialize the cluster with py4j?&lt;/P&gt;</description>
      <pubDate>Mon, 18 Mar 2024 05:59:47 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/the-spark-context-has-stopped-and-the-driver-is-restarting-your/m-p/63951#M32412</guid>
      <dc:creator>Sachin_</dc:creator>
      <dc:date>2024-03-18T05:59:47Z</dc:date>
    </item>
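Regarding the question above about initializing py4j: a Databricks Python notebook already owns a running py4j gateway, exposed as spark._jvm, so the JAR should not start its own GatewayServer or build its own context. A hedged sketch of a JAR-side bridge object shaped to be called through that existing gateway (package and object names are hypothetical):

```scala
package com.example

import org.apache.spark.sql.SparkSession

// Hypothetical bridge for py4j callers. It performs no py4j setup and
// creates no new SparkContext: getOrCreate() simply attaches to the
// session the notebook driver already has.
object EtlBridge {
  def run(tableName: String): Long = {
    val spark = SparkSession.builder().getOrCreate()
    spark.table(tableName).count()
  }
}
```

From the Python side this would be invoked as spark._jvm.com.example.EtlBridge.run("my_table"). Launching a separate gateway, or stopping the session from inside the JAR, is consistent with the "no active spark context" stack trace shown above.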
    <item>
      <title>Re: The spark context has stopped and the driver is restarting. Your notebook will be automatically</title>
      <link>https://community.databricks.com/t5/data-engineering/the-spark-context-has-stopped-and-the-driver-is-restarting-your/m-p/64215#M32499</link>
      <description>&lt;P&gt;Could you share the code you have in your JAR file? How are you creating your Spark context in your JAR file?&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Wed, 20 Mar 2024 21:32:37 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/the-spark-context-has-stopped-and-the-driver-is-restarting-your/m-p/64215#M32499</guid>
      <dc:creator>jose_gonzalez</dc:creator>
      <dc:date>2024-03-20T21:32:37Z</dc:date>
    </item>
    <item>
      <title>Re: The spark context has stopped and the driver is restarting. Your notebook will be automatically</title>
      <link>https://community.databricks.com/t5/data-engineering/the-spark-context-has-stopped-and-the-driver-is-restarting-your/m-p/65494#M32825</link>
      <description>&lt;P&gt;Hello&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/29880"&gt;@jose_gonzalez&lt;/a&gt;&amp;nbsp;and&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/78314"&gt;@Kani&lt;/a&gt;&amp;nbsp;! Apologies for the late reply. This is how we initialize the Spark session in our ETL:&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Sachin__0-1712220666036.png" style="width: 400px;"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/6914i509FCE2E2E101223/image-size/medium/is-moderation-mode/true?v=v2&amp;amp;px=400" role="button" title="Sachin__0-1712220666036.png" alt="Sachin__0-1712220666036.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Thu, 04 Apr 2024 08:51:35 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/the-spark-context-has-stopped-and-the-driver-is-restarting-your/m-p/65494#M32825</guid>
      <dc:creator>Sachin_</dc:creator>
      <dc:date>2024-04-04T08:51:35Z</dc:date>
    </item>
  </channel>
</rss>