I have followed the documentation and am using the same metastore config that works in the Data Engineering context. When I attempt to view the databases, I get the error:
Encountered an internal error
The following information failed to load:
- The list of databases in hive_metastore catalog
Please try again or contact your Databricks representative if the issue persists.
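For comparison, the same listing works from a notebook on an interactive (Data Engineering) cluster that carries the identical metastore config. A minimal sanity check there, using the notebook's built-in spark session, looks like:

# Lists databases through the external Hive metastore; on the Data
# Engineering cluster with the same config this returns the expected list.
spark.sql("SHOW DATABASES").show()

# Confirms the effective metastore settings at runtime.
print(spark.conf.get("spark.sql.hive.metastore.version"))
print(spark.conf.get("spark.sql.hive.metastore.jars"))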
My SQL Endpoint config is:
spark.hadoop.javax.jdo.option.ConnectionURL {{secrets/key-vault-secrets/Metastore-ConnectionURL}}
spark.hadoop.javax.jdo.option.ConnectionUserName {{secrets/key-vault-secrets/Metastore-ConnectionUserName}}
spark.hadoop.javax.jdo.option.ConnectionPassword {{secrets/key-vault-secrets/Metastore-ConnectionPassword}}
spark.hadoop.javax.jdo.option.ConnectionDriverName com.microsoft.sqlserver.jdbc.SQLServerDriver
spark.sql.hive.metastore.version {{secrets/key-vault-secrets/Metastore-Version}}
spark.sql.hive.metastore.jars {{secrets/key-vault-secrets/Metastore-Jars}}
spark.hadoop.fs.azure.account.auth.type.{{secrets/key-vault-secrets/Lakehouse-Account}}.dfs.core.windows.net OAuth
spark.hadoop.fs.azure.account.oauth.provider.type.{{secrets/key-vault-secrets/Lakehouse-Account}}.dfs.core.windows.net org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider
spark.hadoop.fs.azure.account.oauth2.client.id.{{secrets/key-vault-secrets/Lakehouse-Account}}.dfs.core.windows.net {{secrets/key-vault-secrets/Lakehouse-ServiceAccount-SQLDataAccess}}
spark.hadoop.fs.azure.account.oauth2.client.secret.{{secrets/key-vault-secrets/Lakehouse-Account}}.dfs.core.windows.net {{secrets/key-vault-secrets/Lakehouse-SQLDataAccess-Secret}}
spark.hadoop.fs.azure.account.oauth2.client.endpoint.{{secrets/key-vault-secrets/Lakehouse-Account}}.dfs.core.windows.net https://login.microsoftonline.com/{{secrets/key-vault-secrets/Tenant-Id}}/oauth2/token
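The storage-side OAuth settings (the fs.azure.* lines) can be exercised independently of the metastore connection. A sketch of the check I can run from a notebook on the working cluster, where mycontainer and mystorageaccount are hypothetical placeholders for the real container and the account resolved from {{secrets/key-vault-secrets/Lakehouse-Account}}:

# "mycontainer" and "mystorageaccount" are placeholder names, not the real
# values. If the client-credentials OAuth settings above are applied, this
# listing should succeed against ADLS Gen2.
dbutils.fs.ls("abfss://mycontainer@mystorageaccount.dfs.core.windows.net/")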