Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Cannot Get Databricks SQL to read external Hive Metastore

TimK
New Contributor II

I have followed the documentation and am using the same metastore config that works in the Data Engineering context. When I attempt to view the databases, I get the error:

Encountered an internal error

The following information failed to load:

  • The list of databases in hive_metastore catalog

Please try again or contact your Databricks representative if the issue persists.

My SQL Endpoint config is:

spark.hadoop.javax.jdo.option.ConnectionURL {{secrets/key-vault-secrets/Metastore-ConnectionURL}}
spark.hadoop.javax.jdo.option.ConnectionUserName {{secrets/key-vault-secrets/Metastore-ConnectionUserName}}
spark.hadoop.javax.jdo.option.ConnectionPassword {{secrets/key-vault-secrets/Metastore-ConnectionPassword}}
spark.hadoop.javax.jdo.option.ConnectionDriverName com.microsoft.sqlserver.jdbc.SQLServerDriver
spark.sql.hive.metastore.version {{secrets/key-vault-secrets/Metastore-Version}}
spark.sql.hive.metastore.jars {{secrets/key-vault-secrets/Metastore-Jars}}
spark.hadoop.fs.azure.account.auth.type.{{secrets/key-vault-secrets/Lakehouse-Account}}.dfs.core.windows.net OAuth
spark.hadoop.fs.azure.account.oauth.provider.type.{{secrets/key-vault-secrets/Lakehouse-Account}}.dfs.core.windows.net org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider
spark.hadoop.fs.azure.account.oauth2.client.id.{{secrets/key-vault-secrets/Lakehouse-Account}}.dfs.core.windows.net {{secrets/key-vault-secrets/Lakehouse-ServiceAccount-SQLDataAccess}}
spark.hadoop.fs.azure.account.oauth2.client.secret.{{secrets/key-vault-secrets/Lakehouse-Account}}.dfs.core.windows.net {{secrets/key-vault-secrets/Lakehouse-SQLDataAccess-Secret}}
spark.hadoop.fs.azure.account.oauth2.client.endpoint.{{secrets/key-vault-secrets/Lakehouse-Account}}.dfs.core.windows.net https://login.microsoftonline.com/{{secrets/key-vault-secrets/Tenant-Id}}/oauth2/token
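Before digging deeper, it can help to sanity-check the secret references in a config like the one above. Databricks resolves placeholders of the form `{{secrets/<scope>/<key>}}` at endpoint startup; a malformed reference is passed through literally. The following is a minimal, illustrative sketch (the `validate_secret_refs` helper is not a Databricks API):

```python
import re

# Databricks secret references in Spark configs use the form
# {{secrets/<scope>/<key>}}; anything else is passed through literally.
SECRET_REF = re.compile(r"\{\{secrets/([\w-]+)/([\w-]+)\}\}")

def validate_secret_refs(config_line: str) -> list:
    """Return (scope, key) pairs for every well-formed secret
    reference found in a single Spark config line."""
    return SECRET_REF.findall(config_line)

line = ("spark.hadoop.javax.jdo.option.ConnectionURL "
        "{{secrets/key-vault-secrets/Metastore-ConnectionURL}}")
print(validate_secret_refs(line))
# [('key-vault-secrets', 'Metastore-ConnectionURL')]
```

A line with no matches (for example a plain literal value) returns an empty list, which is expected and not an error.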

1 ACCEPTED SOLUTION

Accepted Solutions

BilalAslamDbrx
Databricks Employee

@Tim Kracht this shouldn't be happening. Go to Query History, pick a query, go to Details, then Environment, and look for:

spark.databricks.clusterUsageTags.sparkVersion

What does this say?


2 REPLIES

TimK
New Contributor II

@Bilal Aslam I didn't think to look there before since I hadn't tried to run any queries. I see the failed SHOW DATABASES queries in history and they identify the error:

Builtin jars can only be used when hive execution version == hive metastore version. Execution: 2.3.9 != Metastore: 2.3.7. Specify a valid path to the correct hive jars using spark.sql.hive.metastore.jars or change spark.sql.hive.metastore.version to 2.3.9.

My Data Engineering clusters are running the 9.1 LTS runtime and it looks like SQL is running 10.0.x-photon-scala2.12. I updated my SQL Endpoint spark.sql.hive.metastore.version setting to 2.3.9 which fixed the issue. Thank you!
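The root cause generalizes: when `spark.sql.hive.metastore.jars` is `builtin`, the configured `spark.sql.hive.metastore.version` must exactly match the Hive client version bundled with the runtime. A minimal sketch of that check, using a runtime-to-Hive mapping inferred from this thread (9.1 LTS paired with 2.3.7, 10.0.x with 2.3.9; confirm against the release notes for your runtime):

```python
# Assumed mapping of Databricks runtime version to its built-in Hive
# client version, inferred from this thread; verify in release notes.
BUILTIN_HIVE = {
    "9.1.x-scala2.12": "2.3.7",
    "10.0.x-photon-scala2.12": "2.3.9",
}

def check_metastore_config(runtime: str, metastore_version: str,
                           jars: str = "builtin") -> str:
    """Mimic the Hive-version check that produced the error above."""
    execution = BUILTIN_HIVE[runtime]
    if jars == "builtin" and execution != metastore_version:
        return (f"Builtin jars can only be used when hive execution "
                f"version == hive metastore version. Execution: "
                f"{execution} != Metastore: {metastore_version}.")
    return "OK"

# The failing SQL endpoint: a 10.0.x runtime with the 9.1-era setting
print(check_metastore_config("10.0.x-photon-scala2.12", "2.3.7"))
# After updating spark.sql.hive.metastore.version to 2.3.9
print(check_metastore_config("10.0.x-photon-scala2.12", "2.3.9"))  # OK
```

Because SQL warehouses and Data Engineering clusters can run different runtimes, a metastore version stored in a shared secret can silently drift out of sync with one of them, which is exactly what happened here.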
