<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Hive version after Upgrade Azure Databricks from 6.4 (Spark 2) to 9.1 (Spark 3) in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/hive-version-after-upgrade-azure-databricks-from-6-4-spark-2-to/m-p/32880#M23996</link>
    <description>&lt;P&gt;I have upgraded Azure Databricks from 6.4 to 9.1, which enables me to use Spark 3. As far as I know, the Hive version has to be upgraded to 2.3.7 as well, as discussed in: &lt;/P&gt;&lt;P&gt;&lt;A href="https://community.databricks.com/s/question/0D53f00001HKHy2CAH/how-to-upgrade-internal-hive-metadata-store-version" target="_blank"&gt;https://community.databricks.com/s/question/0D53f00001HKHy2CAH/how-to-upgrade-internal-hive-metadata-store-version&lt;/A&gt;&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;I have tried not setting those options and continuing to use Hive 0.13 to run my Spark 3 application, and everything seems fine. Is it mandatory to upgrade the Hive metastore to 2.3.7 for a Spark 3 program to run? Everything seems fine without adding:&lt;/P&gt;&lt;PRE&gt;&lt;CODE&gt;spark.sql.hive.metastore.version 2.3.7
spark.sql.hive.metastore.jars builtin&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;&lt;/P&gt;</description>
    <pubDate>Thu, 16 Dec 2021 21:55:05 GMT</pubDate>
    <dc:creator>jeffreym9</dc:creator>
    <dc:date>2021-12-16T21:55:05Z</dc:date>
    <item>
      <title>Hive version after Upgrade Azure Databricks from 6.4 (Spark 2) to 9.1 (Spark 3)</title>
      <link>https://community.databricks.com/t5/data-engineering/hive-version-after-upgrade-azure-databricks-from-6-4-spark-2-to/m-p/32880#M23996</link>
      <description>&lt;P&gt;I have upgraded Azure Databricks from 6.4 to 9.1, which enables me to use Spark 3. As far as I know, the Hive version has to be upgraded to 2.3.7 as well, as discussed in: &lt;/P&gt;&lt;P&gt;&lt;A href="https://community.databricks.com/s/question/0D53f00001HKHy2CAH/how-to-upgrade-internal-hive-metadata-store-version" target="_blank"&gt;https://community.databricks.com/s/question/0D53f00001HKHy2CAH/how-to-upgrade-internal-hive-metadata-store-version&lt;/A&gt;&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;I have tried not setting those options and continuing to use Hive 0.13 to run my Spark 3 application, and everything seems fine. Is it mandatory to upgrade the Hive metastore to 2.3.7 for a Spark 3 program to run? Everything seems fine without adding:&lt;/P&gt;&lt;PRE&gt;&lt;CODE&gt;spark.sql.hive.metastore.version 2.3.7
spark.sql.hive.metastore.jars builtin&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;&lt;/P&gt;</description>
      <pubDate>Thu, 16 Dec 2021 21:55:05 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/hive-version-after-upgrade-azure-databricks-from-6-4-spark-2-to/m-p/32880#M23996</guid>
      <dc:creator>jeffreym9</dc:creator>
      <dc:date>2021-12-16T21:55:05Z</dc:date>
    </item>
    <item>
      <title>Re: Hive version after Upgrade Azure Databricks from 6.4 (Spark 2) to 9.1 (Spark 3)</title>
      <link>https://community.databricks.com/t5/data-engineering/hive-version-after-upgrade-azure-databricks-from-6-4-spark-2-to/m-p/32882#M23998</link>
      <description>&lt;P&gt;@Jeffrey Mak​&amp;nbsp;all supported versions are listed here: &lt;A href="https://docs.microsoft.com/en-us/azure/databricks/data/metastores/external-hive-metastore" target="_blank"&gt;https://docs.microsoft.com/en-us/azure/databricks/data/metastores/external-hive-metastore&lt;/A&gt;. Also, I do not think builtin will work.&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;For all other Hive versions, Azure Databricks recommends that you download the metastore JARs and set the configuration&amp;nbsp;spark.sql.hive.metastore.jars&amp;nbsp;to point to the downloaded JARs using the procedure described in&amp;nbsp;&lt;A href="https://docs.microsoft.com/en-us/azure/databricks/data/metastores/external-hive-metastore#download-the-metastore-jars-and-point-to-them" alt="https://docs.microsoft.com/en-us/azure/databricks/data/metastores/external-hive-metastore#download-the-metastore-jars-and-point-to-them" target="_blank"&gt;Download the metastore jars and point to them&lt;/A&gt;.&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 12 Jan 2022 02:55:03 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/hive-version-after-upgrade-azure-databricks-from-6-4-spark-2-to/m-p/32882#M23998</guid>
      <dc:creator>Atanu</dc:creator>
      <dc:date>2022-01-12T02:55:03Z</dc:date>
    </item>
    <item>
      <title>Re: Hive version after Upgrade Azure Databricks from 6.4 (Spark 2) to 9.1 (Spark 3)</title>
      <link>https://community.databricks.com/t5/data-engineering/hive-version-after-upgrade-azure-databricks-from-6-4-spark-2-to/m-p/32883#M23999</link>
      <description>&lt;P&gt;@Jeffrey Mak​&amp;nbsp;- Does Atanu's answer resolve the issue for you?  If yes, would you be happy to mark it as best so that other members can find the solution more quickly?&lt;/P&gt;</description>
      <pubDate>Wed, 26 Jan 2022 16:19:55 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/hive-version-after-upgrade-azure-databricks-from-6-4-spark-2-to/m-p/32883#M23999</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2022-01-26T16:19:55Z</dc:date>
    </item>
    <item>
      <title>Re: Hive version after Upgrade Azure Databricks from 6.4 (Spark 2) to 9.1 (Spark 3)</title>
      <link>https://community.databricks.com/t5/data-engineering/hive-version-after-upgrade-azure-databricks-from-6-4-spark-2-to/m-p/32884#M24000</link>
      <description>&lt;P&gt;I'm asking about Databricks version 9.1. I've followed the URL given (&lt;A href="https://docs.microsoft.com/en-us/azure/databricks/data/metastores/external-hive-metastore" alt="https://docs.microsoft.com/en-us/azure/databricks/data/metastores/external-hive-metastore" target="_blank"&gt;https://docs.microsoft.com/en-us/azure/databricks/data/metastores/external-hive-metastore&lt;/A&gt;). Would you mind letting me know where in the table the supported Hive version for Databricks 9.1 is mentioned? &lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper" image-alt="Screen Shot 2022-01-26 at 2.06.15 PM"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/2227i4EFA7A796C63BE33/image-size/large?v=v2&amp;amp;px=999" role="button" title="Screen Shot 2022-01-26 at 2.06.15 PM" alt="Screen Shot 2022-01-26 at 2.06.15 PM" /&gt;&lt;/span&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 26 Jan 2022 19:09:04 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/hive-version-after-upgrade-azure-databricks-from-6-4-spark-2-to/m-p/32884#M24000</guid>
      <dc:creator>jeffreym9</dc:creator>
      <dc:date>2022-01-26T19:09:04Z</dc:date>
    </item>
    <item>
      <title>Re: Hive version after Upgrade Azure Databricks from 6.4 (Spark 2) to 9.1 (Spark 3)</title>
      <link>https://community.databricks.com/t5/data-engineering/hive-version-after-upgrade-azure-databricks-from-6-4-spark-2-to/m-p/32885#M24001</link>
      <description>&lt;P&gt;@Jeffrey Mak​&amp;nbsp;the doc is not updated yet; the team may be working on it. You can consider 7.x+ for DBR 9. Please let us know how that goes. Thanks.&lt;/P&gt;</description>
      <pubDate>Sat, 12 Feb 2022 16:30:35 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/hive-version-after-upgrade-azure-databricks-from-6-4-spark-2-to/m-p/32885#M24001</guid>
      <dc:creator>Atanu</dc:creator>
      <dc:date>2022-02-12T16:30:35Z</dc:date>
    </item>
  </channel>
</rss>

