<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Re: Cannot Get Databricks SQL to read external Hive Metastore in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/cannot-get-databricks-sql-to-read-external-hive-metastore/m-p/32712#M23852</link>
    <description>&lt;P&gt;@Bilal Aslam&amp;nbsp;I didn't think to look there before since I hadn't tried to run any queries. I see the failed SHOW DATABASES queries in history, and they identify the error:&lt;/P&gt;&lt;PRE&gt;&lt;CODE&gt;Builtin jars can only be used when hive execution version == hive metastore version. Execution: 2.3.9 != Metastore: 2.3.7. Specify a valid path to the correct hive jars using spark.sql.hive.metastore.jars or change spark.sql.hive.metastore.version to 2.3.9.&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;My Data Engineering clusters are running the 9.1 LTS runtime, and it looks like SQL is running 10.0.x-photon-scala2.12. I updated my SQL Endpoint's spark.sql.hive.metastore.version setting to 2.3.9, which fixed the issue. Thank you!&lt;/P&gt;</description>
    <pubDate>Mon, 27 Dec 2021 15:30:07 GMT</pubDate>
    <dc:creator>TimK</dc:creator>
    <dc:date>2021-12-27T15:30:07Z</dc:date>
    <item>
      <title>Cannot Get Databricks SQL to read external Hive Metastore</title>
      <link>https://community.databricks.com/t5/data-engineering/cannot-get-databricks-sql-to-read-external-hive-metastore/m-p/32709#M23849</link>
      <description>&lt;P&gt;I have followed the documentation and am using the same metastore config that is working in the Data Engineering context. When attempting to view the databases, I get the error:&lt;/P&gt;&lt;P&gt;&lt;I&gt;Encountered an internal error&lt;/I&gt;&lt;/P&gt;&lt;P&gt;&lt;I&gt;The following information failed to load:&lt;/I&gt;&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;&lt;I&gt;The list of databases in hive_metastore catalog&lt;/I&gt;&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;&lt;I&gt;Please try again or contact your Databricks representative if the issue persists.&lt;/I&gt;&lt;/P&gt;&lt;P&gt;My SQL Endpoint config is:&lt;/P&gt;&lt;PRE&gt;&lt;CODE&gt;spark.hadoop.javax.jdo.option.ConnectionURL {{secrets/key-vault-secrets/Metastore-ConnectionURL}}
spark.hadoop.javax.jdo.option.ConnectionUserName {{secrets/key-vault-secrets/Metastore-ConnectionUserName}}
spark.hadoop.javax.jdo.option.ConnectionPassword {{secrets/key-vault-secrets/Metastore-ConnectionPassword}}
spark.hadoop.javax.jdo.option.ConnectionDriverName com.microsoft.sqlserver.jdbc.SQLServerDriver
spark.sql.hive.metastore.version {{secrets/key-vault-secrets/Metastore-Version}}
spark.sql.hive.metastore.jars {{secrets/key-vault-secrets/Metastore-Jars}}
spark.hadoop.fs.azure.account.auth.type.{{secrets/key-vault-secrets/Lakehouse-Account}}.dfs.core.windows.net OAuth
spark.hadoop.fs.azure.account.oauth.provider.type.{{secrets/key-vault-secrets/Lakehouse-Account}}.dfs.core.windows.net org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider
spark.hadoop.fs.azure.account.oauth2.client.id.{{secrets/key-vault-secrets/Lakehouse-Account}}.dfs.core.windows.net {{secrets/key-vault-secrets/Lakehouse-ServiceAccount-SQLDataAccess}}
spark.hadoop.fs.azure.account.oauth2.client.secret.{{secrets/key-vault-secrets/Lakehouse-Account}}.dfs.core.windows.net {{secrets/key-vault-secrets/Lakehouse-SQLDataAccess-Secret}}
spark.hadoop.fs.azure.account.oauth2.client.endpoint.{{secrets/key-vault-secrets/Lakehouse-Account}}.dfs.core.windows.net https://login.microsoftonline.com/{{secrets/key-vault-secrets/Tenant-Id}}/oauth2/token&lt;/CODE&gt;&lt;/PRE&gt;</description>
      <pubDate>Mon, 20 Dec 2021 20:07:04 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/cannot-get-databricks-sql-to-read-external-hive-metastore/m-p/32709#M23849</guid>
      <dc:creator>TimK</dc:creator>
      <dc:date>2021-12-20T20:07:04Z</dc:date>
    </item>
    <item>
      <title>Re: Cannot Get Databricks SQL to read external Hive Metastore</title>
      <link>https://community.databricks.com/t5/data-engineering/cannot-get-databricks-sql-to-read-external-hive-metastore/m-p/32711#M23851</link>
      <description>&lt;P&gt;@Tim Kracht&amp;nbsp;this shouldn't be happening. Go to Query History, pick a query, go to Details, then Environment, and look for:&lt;/P&gt;&lt;P&gt;&lt;B&gt;spark.databricks.clusterUsageTags.sparkVersion&lt;/B&gt;&lt;/P&gt;&lt;P&gt;What does this say?&lt;/P&gt;</description>
      <pubDate>Sun, 26 Dec 2021 21:13:22 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/cannot-get-databricks-sql-to-read-external-hive-metastore/m-p/32711#M23851</guid>
      <dc:creator>BilalAslamDbrx</dc:creator>
      <dc:date>2021-12-26T21:13:22Z</dc:date>
    </item>
    <item>
      <title>Re: Cannot Get Databricks SQL to read external Hive Metastore</title>
      <link>https://community.databricks.com/t5/data-engineering/cannot-get-databricks-sql-to-read-external-hive-metastore/m-p/32712#M23852</link>
      <description>&lt;P&gt;@Bilal Aslam&amp;nbsp;I didn't think to look there before since I hadn't tried to run any queries. I see the failed SHOW DATABASES queries in history, and they identify the error:&lt;/P&gt;&lt;PRE&gt;&lt;CODE&gt;Builtin jars can only be used when hive execution version == hive metastore version. Execution: 2.3.9 != Metastore: 2.3.7. Specify a valid path to the correct hive jars using spark.sql.hive.metastore.jars or change spark.sql.hive.metastore.version to 2.3.9.&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;My Data Engineering clusters are running the 9.1 LTS runtime, and it looks like SQL is running 10.0.x-photon-scala2.12. I updated my SQL Endpoint's spark.sql.hive.metastore.version setting to 2.3.9, which fixed the issue. Thank you!&lt;/P&gt;</description>
      <pubDate>Mon, 27 Dec 2021 15:30:07 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/cannot-get-databricks-sql-to-read-external-hive-metastore/m-p/32712#M23852</guid>
      <dc:creator>TimK</dc:creator>
      <dc:date>2021-12-27T15:30:07Z</dc:date>
    </item>
  </channel>
</rss>

