04-08-2025 01:29 AM
Hi!
I'm trying to connect to an Azure Databricks SQL warehouse from DBeaver using the Databricks JDBC driver version 2.7.1, and I cannot get M2M (machine-to-machine) authentication to work. I get a 'Not Authorized' (401) response when I try to connect, and the issue seems to lie in the retrieval of the OAuth token. Here's the trace log from DBeaver:
Apr 08 09:29:57.486 TRACE 48 com.databricks.client.hivecommon.api.HS2Client.createOpenSessionReq(): +++++ enter +++++
Apr 08 09:29:57.492 DEBUG 48 com.databricks.client.hivecommon.api.HS2Client.createOpenSessionReq: Setting client protocol as unset to get highest supported version by the server
Apr 08 09:29:57.492 DEBUG 48 com.databricks.client.hivecommon.api.HS2Client.createOpenSessionReq: Setting GetInfos in the OpenSession request
Apr 08 09:29:57.493 DEBUG 48 com.databricks.client.hivecommon.api.TEHTTPSettings.setThriftSessionTag: Setting Thrift session tag in HTTP header: 3082e20f-af3c-4084-afb7-dab97ed6555d
Apr 08 09:29:57.494 TRACE 48 com.databricks.client.hivecommon.api.HS2OAuthClientWrapper.OpenSession(TOpenSessionReq(configuration:{logLevel=TRACE}, getInfos:[CLI_DBMS_NAME, CLI_DBMS_VER], client_protocol_i64:42248, connectionProperties:{UseNativeQuery=2, UseProxy=0}, initialNamespace:TNamespace(schemaName:schemaname), canUseMultipleCatalogs:true)): +++++ enter +++++
Apr 08 09:29:57.494 TRACE 48 com.databricks.client.hivecommon.api.HS2OAuthClientWrapper.validateTokens(): +++++ enter +++++
Apr 08 09:29:57.496 TRACE 48 com.databricks.client.jdbc.oauth.OAuthFactory.clientCredentialOAuth(com.databricks.client.hivecommon.HiveJDBCSettings@41d4f955, com.databricks.client.jdbc.common.SSLSettings@63b4c461, com.databricks.client.hivecommon.core.HiveJDBCCommonConnectionLogger@6ddfffff): +++++ enter +++++
Apr 08 09:29:57.508 TRACE 48 com.databricks.client.jdbc.oauth.OAuthFactory.executeRequestWithRetry(https://host.azuredatabricks.net/oidc/oauth2/v2.0/token, POST https://host.azuredatabricks.net/oidc/oauth2/v2.0/token HTTP/1.1): +++++ enter +++++
Apr 08 09:29:57.511 DEBUG 48 com.databricks.client.jdbc.oauth.OAuthFactory.executeRequestWithRetry: Timeout for OAuth HTTP request is 900000 minutes.
Apr 08 09:29:57.788 WARN 48 com.databricks.client.jdbc.oauth.OAuthFactory.executeRequestWithRetry: Got response code 401
Apr 08 09:29:57.788 WARN 48 com.databricks.client.jdbc.oauth.OAuthFactory.executeRequestWithRetry: OAuth HTTP request for token was unsuccessful. Received HTTP status code 401
Apr 08 09:29:57.788 WARN 48 com.databricks.client.jdbc.oauth.OAuthFactory.executeRequestWithRetry: This HTTP status code is not listed in triable HTTP code setting. This request will not be retried
Apr 08 09:29:57.788 TRACE 48 com.databricks.client.jdbc.oauth.OAuthFactory.checkResponse(): +++++ enter +++++
Apr 08 09:29:57.791 ERROR 48 com.databricks.client.exceptions.ExceptionConverter.toSQLException: [Databricks][JDBCDriver](500151) Error setting/closing session: 401 Unauthorized .
java.sql.SQLException: [Databricks][JDBCDriver](500151) Error setting/closing session: 401 Unauthorized .
at com.databricks.client.hivecommon.api.HS2Client.openSession(Unknown Source)
at com.databricks.client.hivecommon.api.HS2Client.<init>(Unknown Source)
at com.databricks.client.spark.jdbc.DownloadableFetchClient.<init>(Unknown Source)
at com.databricks.client.spark.jdbc.DownloadableFetchClientFactory.createClient(Unknown Source)
at com.databricks.client.hivecommon.core.HiveJDBCCommonConnection.connectToServer(Unknown Source)
at com.databricks.client.spark.core.SparkJDBCConnection.connectToServer(Unknown Source)
at com.databricks.client.hivecommon.core.HiveJDBCCommonConnection.establishConnection(Unknown Source)
at com.databricks.client.spark.core.SparkJDBCConnection.establishConnection(Unknown Source)
at com.databricks.client.jdbc.core.LoginTimeoutConnection.connect(Unknown Source)
at com.databricks.client.jdbc.common.BaseConnectionFactory.doConnect(Unknown Source)
at com.databricks.client.jdbc.common.AbstractDriver.connect(Unknown Source)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCConnectionOpener.run(JDBCConnectionOpener.java:109)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCConnectionOpener.run(JDBCConnectionOpener.java:83)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCDataSource.openConnection(JDBCDataSource.java:214)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCDataSource.openConnection(JDBCDataSource.java:133)
at org.jkiss.dbeaver.ext.generic.model.GenericDataSource.openConnection(GenericDataSource.java:160)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCExecutionContext.connect(JDBCExecutionContext.java:124)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCRemoteInstance.initializeMainContext(JDBCRemoteInstance.java:106)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCRemoteInstance.<init>(JDBCRemoteInstance.java:61)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCDataSource.initializeRemoteInstance(JDBCDataSource.java:125)
at org.jkiss.dbeaver.ext.generic.model.GenericDataSource.<init>(GenericDataSource.java:125)
at org.jkiss.dbeaver.ext.databricks.DatabricksDataSource.<init>(DatabricksDataSource.java:36)
at org.jkiss.dbeaver.ext.databricks.model.DatabricksMetaModel.createDataSourceImpl(DatabricksMetaModel.java:68)
at org.jkiss.dbeaver.ext.generic.GenericDataSourceProvider.openDataSource(GenericDataSourceProvider.java:57)
at org.jkiss.dbeaver.registry.DataSourceDescriptor.openDataSource(DataSourceDescriptor.java:1417)
at org.jkiss.dbeaver.registry.DataSourceDescriptor.connect0(DataSourceDescriptor.java:1280)
at org.jkiss.dbeaver.registry.DataSourceDescriptor.connect(DataSourceDescriptor.java:1070)
at org.jkiss.dbeaver.runtime.jobs.ConnectJob.run(ConnectJob.java:78)
at org.jkiss.dbeaver.runtime.jobs.ConnectionTestJob.run(ConnectionTestJob.java:102)
at org.jkiss.dbeaver.model.runtime.AbstractJob.run(AbstractJob.java:119)
Caused by: com.databricks.client.support.exceptions.GeneralException: [Databricks][JDBCDriver](500151) Error setting/closing session: 401 Unauthorized .
... 30 more
Caused by: com.databricks.client.jdbc42.internal.apache.thrift.TException: 401 Unauthorized
at com.databricks.client.hivecommon.api.HS2OAuthClientWrapper.executeClientCredentialAuthFlow(Unknown Source)
at com.databricks.client.hivecommon.api.HS2OAuthClientWrapper.validateTokens(Unknown Source)
at com.databricks.client.hivecommon.api.HS2OAuthClientWrapper.OpenSession(Unknown Source)
at com.databricks.client.hivecommon.api.HS2Client.openSession(Unknown Source)
at com.databricks.client.hivecommon.api.HS2Client.<init>(Unknown Source)
at com.databricks.client.spark.jdbc.DownloadableFetchClient.<init>(Unknown Source)
at com.databricks.client.spark.jdbc.DownloadableFetchClientFactory.createClient(Unknown Source)
at com.databricks.client.hivecommon.core.HiveJDBCCommonConnection.connectToServer(Unknown Source)
at com.databricks.client.spark.core.SparkJDBCConnection.connectToServer(Unknown Source)
at com.databricks.client.hivecommon.core.HiveJDBCCommonConnection.establishConnection(Unknown Source)
at com.databricks.client.spark.core.SparkJDBCConnection.establishConnection(Unknown Source)
at com.databricks.client.jdbc.core.LoginTimeoutConnection.connect(Unknown Source)
at com.databricks.client.jdbc.common.BaseConnectionFactory.doConnect(Unknown Source)
at com.databricks.client.jdbc.common.AbstractDriver.connect(Unknown Source)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCConnectionOpener.run(JDBCConnectionOpener.java:109)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCConnectionOpener.run(JDBCConnectionOpener.java:83)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCDataSource.openConnection(JDBCDataSource.java:214)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCDataSource.openConnection(JDBCDataSource.java:133)
at org.jkiss.dbeaver.ext.generic.model.GenericDataSource.openConnection(GenericDataSource.java:160)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCExecutionContext.connect(JDBCExecutionContext.java:124)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCRemoteInstance.initializeMainContext(JDBCRemoteInstance.java:106)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCRemoteInstance.<init>(JDBCRemoteInstance.java:61)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCDataSource.initializeRemoteInstance(JDBCDataSource.java:125)
at org.jkiss.dbeaver.ext.generic.model.GenericDataSource.<init>(GenericDataSource.java:125)
at org.jkiss.dbeaver.ext.databricks.DatabricksDataSource.<init>(DatabricksDataSource.java:36)
at org.jkiss.dbeaver.ext.databricks.model.DatabricksMetaModel.createDataSourceImpl(DatabricksMetaModel.java:68)
at org.jkiss.dbeaver.ext.generic.GenericDataSourceProvider.openDataSource(GenericDataSourceProvider.java:57)
at org.jkiss.dbeaver.registry.DataSourceDescriptor.openDataSource(DataSourceDescriptor.java:1417)
at org.jkiss.dbeaver.registry.DataSourceDescriptor.connect0(DataSourceDescriptor.java:1280)
at org.jkiss.dbeaver.registry.DataSourceDescriptor.connect(DataSourceDescriptor.java:1070)
at org.jkiss.dbeaver.runtime.jobs.ConnectJob.run(ConnectJob.java:78)
at org.jkiss.dbeaver.runtime.jobs.ConnectionTestJob.run(ConnectionTestJob.java:102)
at org.jkiss.dbeaver.model.runtime.AbstractJob.run(AbstractJob.java:119)
at org.eclipse.core.internal.jobs.Worker.run(Worker.java:63)
2025-04-08 09:29:57.813 - 401 Unauthorized
This is the JDBC URL:
04-18-2025 04:19 AM - edited 04-18-2025 04:20 AM
Hi @jakobhaggstrom, this error is most likely caused by the type of secret you're using. For M2M authentication, the Databricks JDBC driver requires a Databricks-generated OAuth secret, not a Microsoft Entra ID client secret. While your service principal credentials may work with other tools such as dbt, the JDBC connection will fail with a 401 if the wrong secret type is provided.
Try switching to a Databricks OAuth secret and see if that resolves the issue.
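For reference, here's a minimal sketch of what an M2M connection with a Databricks-managed OAuth secret looks like from plain Java using the driver's OAuth properties (AuthMech=11 for OAuth 2.0, Auth_Flow=1 for the client-credentials flow). The workspace host, HTTP path, client ID and secret below are placeholders; in DBeaver the same properties can typically be appended to the JDBC URL or set under Driver properties.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class M2MConnectTest {
    public static void main(String[] args) throws Exception {
        // AuthMech=11 selects OAuth 2.0, Auth_Flow=1 selects the client-credentials (M2M) flow.
        // All host/path/credential values are placeholders.
        String url = "jdbc:databricks://adb-1234567890123456.7.azuredatabricks.net:443;"
                + "httpPath=/sql/1.0/warehouses/abcdef1234567890;"
                + "AuthMech=11;Auth_Flow=1;"
                + "OAuth2ClientId=<service-principal-application-id>;"
                + "OAuth2Secret=<databricks-generated-oauth-secret>";

        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT 1")) {
            rs.next();
            System.out.println("Connected, SELECT 1 returned " + rs.getInt(1));
        }
    }
}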
05-08-2025 06:04 PM
Additional information can be found here
06-18-2025 07:56 AM
Hey @jakobhaggstrom,
It's possible to connect to Databricks using an Azure service principal; you're just missing the OIDC discovery URL in the JDBC driver's parameters.
OIDC discovery is enabled by default, but you need to set its endpoint URL, which contains your Azure tenant ID:
OIDCDiscoveryEndpoint=https://login.microsoftonline.com/YOUR_AZURE_TENANT_ID/v2.0/.well-known/openid-configuration
Note that this parameter is needed for both JDBC and ODBC. The name and value are the same.
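Putting it together with the OAuth client-credentials settings (AuthMech=11, Auth_Flow=1), the connection string would look roughly like this, all on one line. The workspace host, warehouse path, application ID, secret and tenant ID are placeholders, and this assumes the Entra service principal's application (client) ID and client secret are passed as OAuth2ClientId / OAuth2Secret:

jdbc:databricks://adb-1234567890123456.7.azuredatabricks.net:443;httpPath=/sql/1.0/warehouses/abcdef1234567890;AuthMech=11;Auth_Flow=1;OAuth2ClientId=<entra-application-id>;OAuth2Secret=<entra-client-secret>;OIDCDiscoveryEndpoint=https://login.microsoftonline.com/YOUR_AZURE_TENANT_ID/v2.0/.well-known/openid-configuration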
Lastly, if you look at the Databricks SDK for Java, for instance, the equivalent is to pass the mandatory Azure tenant ID to the SDK using:
.setAzureTenantId(YOUR_AZURE_TENANT_ID)
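For completeness, here is a minimal sketch with the SDK (assuming the com.databricks:databricks-sdk-java dependency is on the classpath; the host, client ID, secret and tenant ID are placeholders):

import com.databricks.sdk.WorkspaceClient;
import com.databricks.sdk.core.DatabricksConfig;

public class SdkM2MSketch {
    public static void main(String[] args) {
        // Placeholders: substitute your workspace URL, the service principal's
        // application (client) ID, its Entra client secret, and your Azure tenant ID.
        DatabricksConfig config = new DatabricksConfig()
                .setHost("https://adb-1234567890123456.7.azuredatabricks.net")
                .setAzureClientId("<service-principal-application-id>")
                .setAzureClientSecret("<entra-client-secret>")
                .setAzureTenantId("<your-azure-tenant-id>");

        WorkspaceClient workspace = new WorkspaceClient(config);
        // Simple sanity check: print the identity the SDK authenticated as.
        System.out.println(workspace.currentUser().me().getUserName());
    }
}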
Geoffrey.