<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Is it possible to set up external hive metastore with DBSQL? in Warehousing &amp; Analytics</title>
    <link>https://community.databricks.com/t5/warehousing-analytics/is-it-possible-to-set-up-external-hive-metastore-with-dbsql/m-p/20127#M398</link>
    <description>&lt;P&gt;@Harikrishnan Kunhumveettil&amp;nbsp;Using Azure Databricks, I have set up a SQL endpoint with connection details that match the global init script. I can browse tables from a regular cluster in the Data Engineering module, but I get the errors below when running a query through the SQL endpoint.&lt;/P&gt;&lt;P&gt;*****************&lt;/P&gt;&lt;P&gt;Schema browser (left pane) error: Failed to load Databases list.&lt;/P&gt;&lt;P&gt;Query results pane error:&lt;/P&gt;&lt;P&gt;Error running query&lt;/P&gt;&lt;P&gt;org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient&lt;/P&gt;&lt;P&gt;************************&lt;/P&gt;&lt;P&gt;The SQL endpoint settings are as below (the user ID and password are stored in Azure Key Vault):&lt;/P&gt;&lt;P&gt;************************&lt;/P&gt;&lt;P&gt;spark.sql.hive.metastore.* true&lt;/P&gt;&lt;P&gt;spark.hadoop.fs.azure.account.auth.type.&amp;lt;storage_acct&amp;gt;.dfs.core.windows.net OAuth&lt;/P&gt;&lt;P&gt;spark.hadoop.fs.azure.account.oauth.provider.type.&amp;lt;storage_acct&amp;gt;.dfs.core.windows.net org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider&lt;/P&gt;&lt;P&gt;spark.hadoop.fs.azure.account.oauth2.client.id.&amp;lt;storage_acct&amp;gt;.dfs.core.windows.net {{secrets/&amp;lt;secret_scope&amp;gt;/client-id}}&lt;/P&gt;&lt;P&gt;spark.hadoop.fs.azure.account.oauth2.client.secret.&amp;lt;storage_acct&amp;gt;.dfs.core.windows.net {{secrets/&amp;lt;secret_scope&amp;gt;/client-secret}}&lt;/P&gt;&lt;P&gt;spark.hadoop.fs.azure.account.oauth2.client.endpoint.&amp;lt;storage_acct&amp;gt;.dfs.core.windows.net &lt;A href="https://login.microsoftonline.com/&amp;lt;tenant_id&amp;gt;/oauth2/token" target="_blank"&gt;https://login.microsoftonline.com/&amp;lt;tenant_id&amp;gt;/oauth2/token&lt;/A&gt;&lt;/P&gt;&lt;P&gt;spark.hadoop.javax.jdo.option.ConnectionURL jdbc://&amp;lt;sql server name&amp;gt;.database.windows.net:1433;database=&amp;lt;db_name&amp;gt;&lt;/P&gt;&lt;P&gt;spark.hadoop.javax.jdo.option.ConnectionUserName metastoredbuser&lt;/P&gt;&lt;P&gt;spark.hadoop.javax.jdo.option.ConnectionPassword {{secrets/&amp;lt;secret_scope&amp;gt;/metastoredbpwd}}&lt;/P&gt;&lt;P&gt;spark.hadoop.javax.jdo.option.ConnectionDriverName com.microsoft.sqlserver.jdbc.SQLServerDriver&lt;/P&gt;&lt;P&gt;spark.sql.hive.metastore.version 2.3.7&lt;/P&gt;&lt;P&gt;spark.sql.hive.metastore.jars /databricks/hive_metastore_jars/*&lt;/P&gt;&lt;P&gt;************************&lt;/P&gt;</description>
    <pubDate>Thu, 09 Dec 2021 15:41:38 GMT</pubDate>
    <dc:creator>prasadvaze</dc:creator>
    <dc:date>2021-12-09T15:41:38Z</dc:date>
    <item>
      <title>Is it possible to set up external hive metastore with DBSQL?</title>
      <link>https://community.databricks.com/t5/warehousing-analytics/is-it-possible-to-set-up-external-hive-metastore-with-dbsql/m-p/20125#M396</link>
      <description />
      <pubDate>Fri, 21 Mar 2025 11:43:18 GMT</pubDate>
      <guid>https://community.databricks.com/t5/warehousing-analytics/is-it-possible-to-set-up-external-hive-metastore-with-dbsql/m-p/20125#M396</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2025-03-21T11:43:18Z</dc:date>
    </item>
    <item>
      <title>Re: Is it possible to set up external hive metastore with DBSQL?</title>
      <link>https://community.databricks.com/t5/warehousing-analytics/is-it-possible-to-set-up-external-hive-metastore-with-dbsql/m-p/20126#M397</link>
      <description>&lt;P&gt;Yes, it's possible to connect to an external Hive metastore. The configurations are the same as for a normal Databricks cluster.&lt;/P&gt;</description>
      <pubDate>Fri, 25 Jun 2021 21:31:39 GMT</pubDate>
      <guid>https://community.databricks.com/t5/warehousing-analytics/is-it-possible-to-set-up-external-hive-metastore-with-dbsql/m-p/20126#M397</guid>
      <dc:creator>brickster_2018</dc:creator>
      <dc:date>2021-06-25T21:31:39Z</dc:date>
    </item>
    <item>
      <title>Re: Is it possible to set up external hive metastore with DBSQL?</title>
      <link>https://community.databricks.com/t5/warehousing-analytics/is-it-possible-to-set-up-external-hive-metastore-with-dbsql/m-p/20127#M398</link>
      <description>&lt;P&gt;@Harikrishnan Kunhumveettil&amp;nbsp;Using Azure Databricks, I have set up a SQL endpoint with connection details that match the global init script. I can browse tables from a regular cluster in the Data Engineering module, but I get the errors below when running a query through the SQL endpoint.&lt;/P&gt;&lt;P&gt;*****************&lt;/P&gt;&lt;P&gt;Schema browser (left pane) error: Failed to load Databases list.&lt;/P&gt;&lt;P&gt;Query results pane error:&lt;/P&gt;&lt;P&gt;Error running query&lt;/P&gt;&lt;P&gt;org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient&lt;/P&gt;&lt;P&gt;************************&lt;/P&gt;&lt;P&gt;The SQL endpoint settings are as below (the user ID and password are stored in Azure Key Vault):&lt;/P&gt;&lt;P&gt;************************&lt;/P&gt;&lt;P&gt;spark.sql.hive.metastore.* true&lt;/P&gt;&lt;P&gt;spark.hadoop.fs.azure.account.auth.type.&amp;lt;storage_acct&amp;gt;.dfs.core.windows.net OAuth&lt;/P&gt;&lt;P&gt;spark.hadoop.fs.azure.account.oauth.provider.type.&amp;lt;storage_acct&amp;gt;.dfs.core.windows.net org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider&lt;/P&gt;&lt;P&gt;spark.hadoop.fs.azure.account.oauth2.client.id.&amp;lt;storage_acct&amp;gt;.dfs.core.windows.net {{secrets/&amp;lt;secret_scope&amp;gt;/client-id}}&lt;/P&gt;&lt;P&gt;spark.hadoop.fs.azure.account.oauth2.client.secret.&amp;lt;storage_acct&amp;gt;.dfs.core.windows.net {{secrets/&amp;lt;secret_scope&amp;gt;/client-secret}}&lt;/P&gt;&lt;P&gt;spark.hadoop.fs.azure.account.oauth2.client.endpoint.&amp;lt;storage_acct&amp;gt;.dfs.core.windows.net &lt;A href="https://login.microsoftonline.com/&amp;lt;tenant_id&amp;gt;/oauth2/token" target="_blank"&gt;https://login.microsoftonline.com/&amp;lt;tenant_id&amp;gt;/oauth2/token&lt;/A&gt;&lt;/P&gt;&lt;P&gt;spark.hadoop.javax.jdo.option.ConnectionURL jdbc://&amp;lt;sql server name&amp;gt;.database.windows.net:1433;database=&amp;lt;db_name&amp;gt;&lt;/P&gt;&lt;P&gt;spark.hadoop.javax.jdo.option.ConnectionUserName metastoredbuser&lt;/P&gt;&lt;P&gt;spark.hadoop.javax.jdo.option.ConnectionPassword {{secrets/&amp;lt;secret_scope&amp;gt;/metastoredbpwd}}&lt;/P&gt;&lt;P&gt;spark.hadoop.javax.jdo.option.ConnectionDriverName com.microsoft.sqlserver.jdbc.SQLServerDriver&lt;/P&gt;&lt;P&gt;spark.sql.hive.metastore.version 2.3.7&lt;/P&gt;&lt;P&gt;spark.sql.hive.metastore.jars /databricks/hive_metastore_jars/*&lt;/P&gt;&lt;P&gt;************************&lt;/P&gt;</description>
      <pubDate>Thu, 09 Dec 2021 15:41:38 GMT</pubDate>
      <guid>https://community.databricks.com/t5/warehousing-analytics/is-it-possible-to-set-up-external-hive-metastore-with-dbsql/m-p/20127#M398</guid>
      <dc:creator>prasadvaze</dc:creator>
      <dc:date>2021-12-09T15:41:38Z</dc:date>
    </item>
  </channel>
</rss>