<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Older Spark Version loaded into the spark notebook in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/older-spark-version-loaded-into-the-spark-notebook/m-p/12825#M7583</link>
    <description>&lt;P&gt;If you use a pool, please also check which preloaded runtime version is set on the pool.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;If that is not the problem, I cannot help further, as I don't even see 10.0 yet (and after all, it is a Beta).&lt;/P&gt;</description>
    <pubDate>Wed, 20 Oct 2021 16:11:02 GMT</pubDate>
    <dc:creator>Hubert-Dudek</dc:creator>
    <dc:date>2021-10-20T16:11:02Z</dc:date>
    <item>
      <title>Older Spark Version loaded into the spark notebook</title>
      <link>https://community.databricks.com/t5/data-engineering/older-spark-version-loaded-into-the-spark-notebook/m-p/12824#M7582</link>
      <description>&lt;P&gt;I have the Databricks runtime for a job set to the latest, 10.0 Beta (includes Apache Spark 3.2.0, Scala 2.12).&lt;/P&gt;&lt;P&gt;When I check the Spark version in the notebook, I see version 3.1.0 instead of version 3.2.0.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;I need Spark version 3.2 to process my workloads, as that version has the fix for &lt;A href="https://github.com/apache/spark/pull/32788" target="_blank"&gt;https://github.com/apache/spark/pull/32788&lt;/A&gt;&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;A screenshot with the cluster configuration and the older Spark version in the notebook is attached.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper" image-alt="Screen Shot 2021-10-20 at 11.45.10 AM"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/2390iB657A24A23AEE5F6/image-size/large?v=v2&amp;amp;px=999" role="button" title="Screen Shot 2021-10-20 at 11.45.10 AM" alt="Screen Shot 2021-10-20 at 11.45.10 AM" /&gt;&lt;/span&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 20 Oct 2021 15:47:54 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/older-spark-version-loaded-into-the-spark-notebook/m-p/12824#M7582</guid>
      <dc:creator>dbu_spark</dc:creator>
      <dc:date>2021-10-20T15:47:54Z</dc:date>
    </item>
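The mismatch described in the question can also be checked programmatically. A minimal sketch, assuming the standard `spark` SparkSession that Databricks injects into every notebook; since this sketch runs outside Databricks, a hard-coded sample value stands in for `spark.version`:

```python
# Compare the Spark version the notebook actually reports against the
# version the DBR release advertises. In a Databricks notebook you would
# use `spark.version` instead of the sample value below.

def parse_version(v: str) -> tuple:
    """Turn a version string like '3.1.0' into (3, 1, 0) for comparison."""
    return tuple(int(p) for p in v.split(".")[:3])

reported = "3.1.0"   # stand-in for spark.version observed in the notebook
expected = "3.2.0"   # version advertised by DBR 10.0 Beta

if parse_version(reported) < parse_version(expected):
    print(f"Runtime reports Spark {reported}, older than expected {expected}")
```

Comparing parsed tuples rather than raw strings avoids the usual trap where `"3.10.0" < "3.2.0"` lexicographically.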
    <item>
      <title>Re: Older Spark Version loaded into the spark notebook</title>
      <link>https://community.databricks.com/t5/data-engineering/older-spark-version-loaded-into-the-spark-notebook/m-p/12825#M7583</link>
      <description>&lt;P&gt;If you use a pool, please also check which preloaded runtime version is set on the pool.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;If that is not the problem, I cannot help further, as I don't even see 10.0 yet (and after all, it is a Beta).&lt;/P&gt;</description>
      <pubDate>Wed, 20 Oct 2021 16:11:02 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/older-spark-version-loaded-into-the-spark-notebook/m-p/12825#M7583</guid>
      <dc:creator>Hubert-Dudek</dc:creator>
      <dc:date>2021-10-20T16:11:02Z</dc:date>
    </item>
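The pool hint above can be verified outside the UI as well. A hedged sketch against the Databricks Instance Pools REST API: the `/api/2.0/instance-pools/get` endpoint and the `preloaded_spark_versions` response field are my reading of that API, so verify them against your workspace documentation; the host, token, and pool ID below are placeholders.

```python
# Build the request that fetches a pool's configuration; the response JSON
# should contain `preloaded_spark_versions`, the runtime(s) the pool caches.
import urllib.request

def pool_status_request(host: str, token: str, pool_id: str) -> urllib.request.Request:
    url = f"{host}/api/2.0/instance-pools/get?instance_pool_id={pool_id}"
    return urllib.request.Request(url, headers={"Authorization": f"Bearer {token}"})

# Usage with placeholder values (not a real workspace or token):
req = pool_status_request("https://example.cloud.databricks.com", "dapi-XXXX", "pool-123")
print(req.full_url)
```

If the pool preloads an older runtime than the job's cluster spec requests, clusters drawn from that pool can come up on the older version.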
    <item>
      <title>Re: Older Spark Version loaded into the spark notebook</title>
      <link>https://community.databricks.com/t5/data-engineering/older-spark-version-loaded-into-the-spark-notebook/m-p/12826#M7584</link>
      <description>&lt;P&gt;I got the same thing when I tested it out. I guess that's why it's a Beta; it should get fixed soon, I imagine.&lt;/P&gt;</description>
      <pubDate>Wed, 20 Oct 2021 16:27:56 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/older-spark-version-loaded-into-the-spark-notebook/m-p/12826#M7584</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2021-10-20T16:27:56Z</dc:date>
    </item>
    <item>
      <title>Re: Older Spark Version loaded into the spark notebook</title>
      <link>https://community.databricks.com/t5/data-engineering/older-spark-version-loaded-into-the-spark-notebook/m-p/12827#M7585</link>
      <description>&lt;P&gt;I am not using a pool. Thanks for the update, though. Hopefully this gets fixed soon.&lt;/P&gt;</description>
      <pubDate>Wed, 20 Oct 2021 16:54:41 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/older-spark-version-loaded-into-the-spark-notebook/m-p/12827#M7585</guid>
      <dc:creator>dbu_spark</dc:creator>
      <dc:date>2021-10-20T16:54:41Z</dc:date>
    </item>
    <item>
      <title>Re: Older Spark Version loaded into the spark notebook</title>
      <link>https://community.databricks.com/t5/data-engineering/older-spark-version-loaded-into-the-spark-notebook/m-p/12828#M7586</link>
      <description>&lt;P&gt;This is due to some legalese around open-source Spark and Databricks' Spark. Since open-source Spark has not released v3.2 yet, we are not allowed to call the version on DBR 10 "v3.2" yet. It may already include that patch, though.&lt;/P&gt;</description>
      <pubDate>Wed, 20 Oct 2021 18:17:27 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/older-spark-version-loaded-into-the-spark-notebook/m-p/12828#M7586</guid>
      <dc:creator>Dan_Z</dc:creator>
      <dc:date>2021-10-20T18:17:27Z</dc:date>
    </item>
    <item>
      <title>Re: Older Spark Version loaded into the spark notebook</title>
      <link>https://community.databricks.com/t5/data-engineering/older-spark-version-loaded-into-the-spark-notebook/m-p/12829#M7587</link>
      <description>&lt;P&gt;Thanks for the update, @Dan Zafar.&lt;/P&gt;&lt;P&gt;I just ran the job again and am still seeing Spark version 3.1.0. Should I be using spark32 (or something similar) when invoking the Spark session so that I pick up the correct Spark version?&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Any ETA on Spark 3.2 availability would be great.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Thanks&lt;/P&gt;</description>
      <pubDate>Wed, 20 Oct 2021 18:45:26 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/older-spark-version-loaded-into-the-spark-notebook/m-p/12829#M7587</guid>
      <dc:creator>dbu_spark</dc:creator>
      <dc:date>2021-10-20T18:45:26Z</dc:date>
    </item>
    <item>
      <title>Re: Older Spark Version loaded into the spark notebook</title>
      <link>https://community.databricks.com/t5/data-engineering/older-spark-version-loaded-into-the-spark-notebook/m-p/12830#M7588</link>
      <description>&lt;P&gt;It should have all the features you need. Check it out. Legally we can't call it Spark 3.2 yet.&lt;/P&gt;</description>
      <pubDate>Wed, 20 Oct 2021 19:11:51 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/older-spark-version-loaded-into-the-spark-notebook/m-p/12830#M7588</guid>
      <dc:creator>Dan_Z</dc:creator>
      <dc:date>2021-10-20T19:11:51Z</dc:date>
    </item>
    <item>
      <title>Re: Older Spark Version loaded into the spark notebook</title>
      <link>https://community.databricks.com/t5/data-engineering/older-spark-version-loaded-into-the-spark-notebook/m-p/12831#M7589</link>
      <description>&lt;P&gt;I do not think it is loading Spark 3.2. I am still seeing the writeUTF issue that was fixed in Spark 3.2 -&amp;gt; &lt;A href="https://github.com/apache/spark/pull/32788" alt="https://github.com/apache/spark/pull/32788" target="_blank"&gt;https://github.com/apache/spark/pull/32788&lt;/A&gt;&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;PRE&gt;&lt;CODE&gt;Caused by: java.io.UTFDataFormatException: encoded string too long: 97548 bytes
	at java.io.DataOutputStream.writeUTF(DataOutputStream.java:364)
	at java.io.DataOutputStream.writeUTF(DataOutputStream.java:323)&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;Anyway, I will wait for the Databricks runtime to report the correct version.&lt;/P&gt;</description>
      <pubDate>Wed, 20 Oct 2021 19:25:10 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/older-spark-version-loaded-into-the-spark-notebook/m-p/12831#M7589</guid>
      <dc:creator>dbu_spark</dc:creator>
      <dc:date>2021-10-20T19:25:10Z</dc:date>
    </item>
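The `UTFDataFormatException` in the trace above comes from `DataOutputStream.writeUTF`, which prefixes the encoded string with an unsigned 16-bit byte length and therefore rejects anything over 65535 bytes (here, 97548 bytes); the linked PR fixes Spark by not using `writeUTF` for such strings. A small sketch of the limit, approximating Java's modified UTF-8 with plain UTF-8 (identical for ASCII):

```python
# writeUTF stores the encoded byte length in a 2-byte unsigned field, so any
# string whose encoding exceeds 0xFFFF (65535) bytes raises
# UTFDataFormatException on the JVM side.
WRITE_UTF_LIMIT = 0xFFFF

def fits_in_write_utf(s: str) -> bool:
    # Plain UTF-8 as an approximation of Java's modified UTF-8; the
    # 97548-byte string from the stack trace fails this check.
    return len(s.encode("utf-8")) <= WRITE_UTF_LIMIT

print(fits_in_write_utf("x" * 97548))  # False: 97548 bytes > 65535
```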
    <item>
      <title>Re: Older Spark Version loaded into the spark notebook</title>
      <link>https://community.databricks.com/t5/data-engineering/older-spark-version-loaded-into-the-spark-notebook/m-p/12832#M7590</link>
      <description>&lt;P&gt;Yes, this version probably has the Databricks-internal features slated for Spark 3.2, but the features and patches contributed by the open-source community may still be coming. Sorry this isn't available yet; I'm sure it will be very soon. Happy coding!&lt;/P&gt;</description>
      <pubDate>Wed, 20 Oct 2021 21:44:07 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/older-spark-version-loaded-into-the-spark-notebook/m-p/12832#M7590</guid>
      <dc:creator>Dan_Z</dc:creator>
      <dc:date>2021-10-20T21:44:07Z</dc:date>
    </item>
    <item>
      <title>Re: Older Spark Version loaded into the spark notebook</title>
      <link>https://community.databricks.com/t5/data-engineering/older-spark-version-loaded-into-the-spark-notebook/m-p/12833#M7591</link>
      <description>&lt;P&gt;I just noticed that (on Azure anyway) 10.0 is NOT in beta anymore.&lt;/P&gt;&lt;P&gt;So 'very soon' was indeed very soon.&lt;/P&gt;</description>
      <pubDate>Thu, 21 Oct 2021 12:32:23 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/older-spark-version-loaded-into-the-spark-notebook/m-p/12833#M7591</guid>
      <dc:creator>-werners-</dc:creator>
      <dc:date>2021-10-21T12:32:23Z</dc:date>
    </item>
    <item>
      <title>Re: Older Spark Version loaded into the spark notebook</title>
      <link>https://community.databricks.com/t5/data-engineering/older-spark-version-loaded-into-the-spark-notebook/m-p/12834#M7592</link>
      <description>&lt;P&gt;Hi @Dhaivat Upadhyay,&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Good news: DBR 10 was released yesterday, October 20th. You can find more details on the release notes &lt;A href="https://docs.databricks.com/release-notes/runtime/releases.html#supported-databricks-runtime-releases-and-support-schedule" alt="https://docs.databricks.com/release-notes/runtime/releases.html#supported-databricks-runtime-releases-and-support-schedule" target="_blank"&gt;website&lt;/A&gt;.&lt;/P&gt;</description>
      <pubDate>Thu, 21 Oct 2021 16:47:45 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/older-spark-version-loaded-into-the-spark-notebook/m-p/12834#M7592</guid>
      <dc:creator>jose_gonzalez</dc:creator>
      <dc:date>2021-10-21T16:47:45Z</dc:date>
    </item>
  </channel>
</rss>

