Data Engineering
Cannot Get Databricks SQL to read external Hive Metastore

TimK
New Contributor II

I have followed the documentation and am using the same metastore config that works in the Data Engineering context. When I attempt to view the databases, I get the error:

Encountered an internal error

The following information failed to load:

  • The list of databases in hive_metastore catalog

Please try again or contact your Databricks representative if the issue persists.

My SQL Endpoint config is:

spark.hadoop.javax.jdo.option.ConnectionURL {{secrets/key-vault-secrets/Metastore-ConnectionURL}}
spark.hadoop.javax.jdo.option.ConnectionUserName {{secrets/key-vault-secrets/Metastore-ConnectionUserName}}
spark.hadoop.javax.jdo.option.ConnectionPassword {{secrets/key-vault-secrets/Metastore-ConnectionPassword}}
spark.hadoop.javax.jdo.option.ConnectionDriverName com.microsoft.sqlserver.jdbc.SQLServerDriver
spark.sql.hive.metastore.version {{secrets/key-vault-secrets/Metastore-Version}}
spark.sql.hive.metastore.jars {{secrets/key-vault-secrets/Metastore-Jars}}
spark.hadoop.fs.azure.account.auth.type.{{secrets/key-vault-secrets/Lakehouse-Account}}.dfs.core.windows.net OAuth
spark.hadoop.fs.azure.account.oauth.provider.type.{{secrets/key-vault-secrets/Lakehouse-Account}}.dfs.core.windows.net org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider
spark.hadoop.fs.azure.account.oauth2.client.id.{{secrets/key-vault-secrets/Lakehouse-Account}}.dfs.core.windows.net {{secrets/key-vault-secrets/Lakehouse-ServiceAccount-SQLDataAccess}}
spark.hadoop.fs.azure.account.oauth2.client.secret.{{secrets/key-vault-secrets/Lakehouse-Account}}.dfs.core.windows.net {{secrets/key-vault-secrets/Lakehouse-SQLDataAccess-Secret}}
spark.hadoop.fs.azure.account.oauth2.client.endpoint.{{secrets/key-vault-secrets/Lakehouse-Account}}.dfs.core.windows.net https://login.microsoftonline.com/{{secrets/key-vault-secrets/Tenant-Id}}/oauth2/token
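
A minimal sketch for sanity-checking that the secret scope and keys referenced above exist, assuming a notebook where the Databricks runtime provides dbutils:

# List the keys in the scope referenced by the endpoint config,
# to confirm the {{secrets/...}} references can resolve.
scope = "key-vault-secrets"  # scope name taken from the config above
for secret in dbutils.secrets.list(scope):
    print(secret.key)

# Individual values can be fetched for inspection (they print redacted):
# dbutils.secrets.get(scope, "Metastore-ConnectionURL")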

1 ACCEPTED SOLUTION

BilalAslamDbrx
Honored Contributor II

@Tim Kracht this shouldn't be happening. Go to Query History, pick a query, open its Details, then Environment, and look for

spark.databricks.clusterUsageTags.sparkVersion

What does this say?
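
For the Data Engineering side of the comparison, the same tag can be read from a notebook. A minimal sketch, assuming the spark session that the Databricks runtime provides:

# Read the runtime version tag from a notebook attached to the
# Data Engineering cluster, to compare against what Query History
# shows for the SQL endpoint.
version = spark.conf.get("spark.databricks.clusterUsageTags.sparkVersion", "<not set>")
print(version)  # e.g. "9.1.x-scala2.12" on the 9.1 LTS runtime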


3 REPLIES

Kaniz
Community Manager

Hi @TimK! My name is Kaniz, and I'm the technical moderator here. Great to meet you, and thanks for your question! Let's first see if your peers in the community have an answer; if not, I'll get back to you soon. Thanks.


TimK
New Contributor II

@Bilal Aslam I didn't think to look there before, since I hadn't tried to run any queries. I can see the failed SHOW DATABASES queries in the history, and they identify the error:

Builtin jars can only be used when hive execution version == hive metastore version. Execution: 2.3.9 != Metastore: 2.3.7. Specify a valid path to the correct hive jars using spark.sql.hive.metastore.jars or change spark.sql.hive.metastore.version to 2.3.9.

My Data Engineering clusters are running the 9.1 LTS runtime, and it looks like SQL is running 10.0.x-photon-scala2.12. I updated my SQL Endpoint's spark.sql.hive.metastore.version setting to 2.3.9, which fixed the issue. Thank you!
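
Concretely, the endpoint config line needs to resolve to:

spark.sql.hive.metastore.version 2.3.9

(either set literally as above, or by updating the Metastore-Version secret to 2.3.9 so it matches the execution version reported in the error).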
