Warehousing & Analytics
Issue when using M2M authentication with azure databricks jdbc driver 2.7.1

jakobhaggstrom
New Contributor

Hi!

I'm trying to connect to an Azure Databricks SQL warehouse from DBeaver, using the Azure Databricks JDBC driver version 2.7.1, and I cannot get M2M authentication to work. I get a 'Not Authorized' (401) response when I try to connect, and the issue seems to lie in the retrieval of the token. Here's the trace log from DBeaver:

Apr 08 09:29:57.486 TRACE 48 com.databricks.client.hivecommon.api.HS2Client.createOpenSessionReq(): +++++ enter +++++
Apr 08 09:29:57.492 DEBUG 48 com.databricks.client.hivecommon.api.HS2Client.createOpenSessionReq: Setting client protocol as unset to get highest supported version by the server
Apr 08 09:29:57.492 DEBUG 48 com.databricks.client.hivecommon.api.HS2Client.createOpenSessionReq: Setting GetInfos in the OpenSession request
Apr 08 09:29:57.493 DEBUG 48 com.databricks.client.hivecommon.api.TEHTTPSettings.setThriftSessionTag: Setting Thrift session tag in HTTP header: 3082e20f-af3c-4084-afb7-dab97ed6555d
Apr 08 09:29:57.494 TRACE 48 com.databricks.client.hivecommon.api.HS2OAuthClientWrapper.OpenSession(TOpenSessionReq(configuration:{logLevel=TRACE}, getInfos:[CLI_DBMS_NAME, CLI_DBMS_VER], client_protocol_i64:42248, connectionProperties:{UseNativeQuery=2, UseProxy=0}, initialNamespace:TNamespace(schemaName:schemaname), canUseMultipleCatalogs:true)): +++++ enter +++++
Apr 08 09:29:57.494 TRACE 48 com.databricks.client.hivecommon.api.HS2OAuthClientWrapper.validateTokens(): +++++ enter +++++
Apr 08 09:29:57.496 TRACE 48 com.databricks.client.jdbc.oauth.OAuthFactory.clientCredentialOAuth(com.databricks.client.hivecommon.HiveJDBCSettings@41d4f955, com.databricks.client.jdbc.common.SSLSettings@63b4c461, com.databricks.client.hivecommon.core.HiveJDBCCommonConnectionLogger@6ddfffff): +++++ enter +++++
Apr 08 09:29:57.508 TRACE 48 com.databricks.client.jdbc.oauth.OAuthFactory.executeRequestWithRetry(https://host.azuredatabricks.net/oidc/oauth2/v2.0/token, POST https://host.azuredatabricks.net/oidc/oauth2/v2.0/token HTTP/1.1): +++++ enter +++++
Apr 08 09:29:57.511 DEBUG 48 com.databricks.client.jdbc.oauth.OAuthFactory.executeRequestWithRetry: Timeout for OAuth HTTP request is 900000 minutes.
Apr 08 09:29:57.788 WARN  48 com.databricks.client.jdbc.oauth.OAuthFactory.executeRequestWithRetry: Got response code 401
Apr 08 09:29:57.788 WARN  48 com.databricks.client.jdbc.oauth.OAuthFactory.executeRequestWithRetry: OAuth HTTP request for token was unsuccessful. Received HTTP status code 401
Apr 08 09:29:57.788 WARN  48 com.databricks.client.jdbc.oauth.OAuthFactory.executeRequestWithRetry: This HTTP status code is not listed in triable HTTP code setting. This request will not be retried
Apr 08 09:29:57.788 TRACE 48 com.databricks.client.jdbc.oauth.OAuthFactory.checkResponse(): +++++ enter +++++
Apr 08 09:29:57.791 ERROR 48 com.databricks.client.exceptions.ExceptionConverter.toSQLException: [Databricks][JDBCDriver](500151) Error setting/closing session: 401 Unauthorized .
java.sql.SQLException: [Databricks][JDBCDriver](500151) Error setting/closing session: 401 Unauthorized .
at com.databricks.client.hivecommon.api.HS2Client.openSession(Unknown Source)
at com.databricks.client.hivecommon.api.HS2Client.<init>(Unknown Source)
at com.databricks.client.spark.jdbc.DownloadableFetchClient.<init>(Unknown Source)
at com.databricks.client.spark.jdbc.DownloadableFetchClientFactory.createClient(Unknown Source)
at com.databricks.client.hivecommon.core.HiveJDBCCommonConnection.connectToServer(Unknown Source)
at com.databricks.client.spark.core.SparkJDBCConnection.connectToServer(Unknown Source)
at com.databricks.client.hivecommon.core.HiveJDBCCommonConnection.establishConnection(Unknown Source)
at com.databricks.client.spark.core.SparkJDBCConnection.establishConnection(Unknown Source)
at com.databricks.client.jdbc.core.LoginTimeoutConnection.connect(Unknown Source)
at com.databricks.client.jdbc.common.BaseConnectionFactory.doConnect(Unknown Source)
at com.databricks.client.jdbc.common.AbstractDriver.connect(Unknown Source)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCConnectionOpener.run(JDBCConnectionOpener.java:109)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCConnectionOpener.run(JDBCConnectionOpener.java:83)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCDataSource.openConnection(JDBCDataSource.java:214)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCDataSource.openConnection(JDBCDataSource.java:133)
at org.jkiss.dbeaver.ext.generic.model.GenericDataSource.openConnection(GenericDataSource.java:160)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCExecutionContext.connect(JDBCExecutionContext.java:124)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCRemoteInstance.initializeMainContext(JDBCRemoteInstance.java:106)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCRemoteInstance.<init>(JDBCRemoteInstance.java:61)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCDataSource.initializeRemoteInstance(JDBCDataSource.java:125)
at org.jkiss.dbeaver.ext.generic.model.GenericDataSource.<init>(GenericDataSource.java:125)
at org.jkiss.dbeaver.ext.databricks.DatabricksDataSource.<init>(DatabricksDataSource.java:36)
at org.jkiss.dbeaver.ext.databricks.model.DatabricksMetaModel.createDataSourceImpl(DatabricksMetaModel.java:68)
at org.jkiss.dbeaver.ext.generic.GenericDataSourceProvider.openDataSource(GenericDataSourceProvider.java:57)
at org.jkiss.dbeaver.registry.DataSourceDescriptor.openDataSource(DataSourceDescriptor.java:1417)
at org.jkiss.dbeaver.registry.DataSourceDescriptor.connect0(DataSourceDescriptor.java:1280)
at org.jkiss.dbeaver.registry.DataSourceDescriptor.connect(DataSourceDescriptor.java:1070)
at org.jkiss.dbeaver.runtime.jobs.ConnectJob.run(ConnectJob.java:78)
at org.jkiss.dbeaver.runtime.jobs.ConnectionTestJob.run(ConnectionTestJob.java:102)
at org.jkiss.dbeaver.model.runtime.AbstractJob.run(AbstractJob.java:119)
Caused by: com.databricks.client.support.exceptions.GeneralException: [Databricks][JDBCDriver](500151) Error setting/closing session: 401 Unauthorized .
... 30 more
Caused by: com.databricks.client.jdbc42.internal.apache.thrift.TException: 401 Unauthorized 
at com.databricks.client.hivecommon.api.HS2OAuthClientWrapper.executeClientCredentialAuthFlow(Unknown Source)
at com.databricks.client.hivecommon.api.HS2OAuthClientWrapper.validateTokens(Unknown Source)
at com.databricks.client.hivecommon.api.HS2OAuthClientWrapper.OpenSession(Unknown Source)
at com.databricks.client.hivecommon.api.HS2Client.openSession(Unknown Source)
at com.databricks.client.hivecommon.api.HS2Client.<init>(Unknown Source)
at com.databricks.client.spark.jdbc.DownloadableFetchClient.<init>(Unknown Source)
at com.databricks.client.spark.jdbc.DownloadableFetchClientFactory.createClient(Unknown Source)
at com.databricks.client.hivecommon.core.HiveJDBCCommonConnection.connectToServer(Unknown Source)
at com.databricks.client.spark.core.SparkJDBCConnection.connectToServer(Unknown Source)
at com.databricks.client.hivecommon.core.HiveJDBCCommonConnection.establishConnection(Unknown Source)
at com.databricks.client.spark.core.SparkJDBCConnection.establishConnection(Unknown Source)
at com.databricks.client.jdbc.core.LoginTimeoutConnection.connect(Unknown Source)
at com.databricks.client.jdbc.common.BaseConnectionFactory.doConnect(Unknown Source)
at com.databricks.client.jdbc.common.AbstractDriver.connect(Unknown Source)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCConnectionOpener.run(JDBCConnectionOpener.java:109)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCConnectionOpener.run(JDBCConnectionOpener.java:83)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCDataSource.openConnection(JDBCDataSource.java:214)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCDataSource.openConnection(JDBCDataSource.java:133)
at org.jkiss.dbeaver.ext.generic.model.GenericDataSource.openConnection(GenericDataSource.java:160)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCExecutionContext.connect(JDBCExecutionContext.java:124)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCRemoteInstance.initializeMainContext(JDBCRemoteInstance.java:106)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCRemoteInstance.<init>(JDBCRemoteInstance.java:61)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCDataSource.initializeRemoteInstance(JDBCDataSource.java:125)
at org.jkiss.dbeaver.ext.generic.model.GenericDataSource.<init>(GenericDataSource.java:125)
at org.jkiss.dbeaver.ext.databricks.DatabricksDataSource.<init>(DatabricksDataSource.java:36)
at org.jkiss.dbeaver.ext.databricks.model.DatabricksMetaModel.createDataSourceImpl(DatabricksMetaModel.java:68)
at org.jkiss.dbeaver.ext.generic.GenericDataSourceProvider.openDataSource(GenericDataSourceProvider.java:57)
at org.jkiss.dbeaver.registry.DataSourceDescriptor.openDataSource(DataSourceDescriptor.java:1417)
at org.jkiss.dbeaver.registry.DataSourceDescriptor.connect0(DataSourceDescriptor.java:1280)
at org.jkiss.dbeaver.registry.DataSourceDescriptor.connect(DataSourceDescriptor.java:1070)
at org.jkiss.dbeaver.runtime.jobs.ConnectJob.run(ConnectJob.java:78)
at org.jkiss.dbeaver.runtime.jobs.ConnectionTestJob.run(ConnectionTestJob.java:102)
at org.jkiss.dbeaver.model.runtime.AbstractJob.run(AbstractJob.java:119)
at org.eclipse.core.internal.jobs.Worker.run(Worker.java:63)

2025-04-08 09:29:57.813 - 401 Unauthorized 
com.databricks.client.jdbc42.internal.apache.thrift.TException: 401 Unauthorized 
at com.databricks.client.hivecommon.api.HS2OAuthClientWrapper.executeClientCredentialAuthFlow(Unknown Source)
at com.databricks.client.hivecommon.api.HS2OAuthClientWrapper.validateTokens(Unknown Source)
at com.databricks.client.hivecommon.api.HS2OAuthClientWrapper.OpenSession(Unknown Source)
at com.databricks.client.hivecommon.api.HS2Client.openSession(Unknown Source)
at com.databricks.client.hivecommon.api.HS2Client.<init>(Unknown Source)
at com.databricks.client.spark.jdbc.DownloadableFetchClient.<init>(Unknown Source)
at com.databricks.client.spark.jdbc.DownloadableFetchClientFactory.createClient(Unknown Source)
at com.databricks.client.hivecommon.core.HiveJDBCCommonConnection.connectToServer(Unknown Source)
at com.databricks.client.spark.core.SparkJDBCConnection.connectToServer(Unknown Source)
at com.databricks.client.hivecommon.core.HiveJDBCCommonConnection.establishConnection(Unknown Source)
at com.databricks.client.spark.core.SparkJDBCConnection.establishConnection(Unknown Source)
at com.databricks.client.jdbc.core.LoginTimeoutConnection.connect(Unknown Source)
at com.databricks.client.jdbc.common.BaseConnectionFactory.doConnect(Unknown Source)
at com.databricks.client.jdbc.common.AbstractDriver.connect(Unknown Source)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCConnectionOpener.run(JDBCConnectionOpener.java:109)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCConnectionOpener.run(JDBCConnectionOpener.java:83)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCDataSource.openConnection(JDBCDataSource.java:214)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCDataSource.openConnection(JDBCDataSource.java:133)
at org.jkiss.dbeaver.ext.generic.model.GenericDataSource.openConnection(GenericDataSource.java:160)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCExecutionContext.connect(JDBCExecutionContext.java:124)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCRemoteInstance.initializeMainContext(JDBCRemoteInstance.java:106)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCRemoteInstance.<init>(JDBCRemoteInstance.java:61)
at org.jkiss.dbeaver.model.impl.jdbc.JDBCDataSource.initializeRemoteInstance(JDBCDataSource.java:125)
at org.jkiss.dbeaver.ext.generic.model.GenericDataSource.<init>(GenericDataSource.java:125)
at org.jkiss.dbeaver.ext.databricks.DatabricksDataSource.<init>(DatabricksDataSource.java:36)
at org.jkiss.dbeaver.ext.databricks.model.DatabricksMetaModel.createDataSourceImpl(DatabricksMetaModel.java:68)
at org.jkiss.dbeaver.ext.generic.GenericDataSourceProvider.openDataSource(GenericDataSourceProvider.java:57)
at org.jkiss.dbeaver.registry.DataSourceDescriptor.openDataSource(DataSourceDescriptor.java:1417)
at org.jkiss.dbeaver.registry.DataSourceDescriptor.connect0(DataSourceDescriptor.java:1280)
at org.jkiss.dbeaver.registry.DataSourceDescriptor.connect(DataSourceDescriptor.java:1070)
at org.jkiss.dbeaver.runtime.jobs.ConnectJob.run(ConnectJob.java:78)
at org.jkiss.dbeaver.runtime.jobs.ConnectionTestJob.run(ConnectionTestJob.java:102)
at org.jkiss.dbeaver.model.runtime.AbstractJob.run(AbstractJob.java:119)
at org.eclipse.core.internal.jobs.Worker.run(Worker.java:63)

This is the JDBC URL:

jdbc:databricks://host.azuredatabricks.net:443/schemaname;
ssl=1;
httpPath=path;
AuthMech=11;
Auth_Flow=1;
OAuth2ClientId=clientid;
OAuth2Secret=clientsecret;
 
Has anyone encountered the same issue? Did you manage to solve it? I'm using the parameters as the documentation instructs, and the same authentication method with the same service principal works when running dbt commands, so the credentials themselves should be correct.
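
To narrow the failure down to the token retrieval step the trace points at, the request the driver makes can be reproduced directly. The sketch below is illustrative only: it assumes the /oidc/oauth2/v2.0/token endpoint shown in the trace, the all-apis scope Databricks documents for M2M OAuth, and placeholder host and credentials.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class TokenEndpointCheck {
    public static void main(String[] args) throws Exception {
        // Placeholders -- substitute your workspace host and service principal credentials.
        String host = "host.azuredatabricks.net";
        String clientId = "clientid";
        String clientSecret = "clientsecret";

        // Endpoint taken from the driver trace log above.
        String tokenUrl = "https://" + host + "/oidc/oauth2/v2.0/token";

        String basicAuth = Base64.getEncoder()
                .encodeToString((clientId + ":" + clientSecret).getBytes(StandardCharsets.UTF_8));

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(tokenUrl))
                .header("Authorization", "Basic " + basicAuth)
                .header("Content-Type", "application/x-www-form-urlencoded")
                // client_credentials grant with the "all-apis" scope used for Databricks M2M OAuth.
                .POST(HttpRequest.BodyPublishers.ofString("grant_type=client_credentials&scope=all-apis"))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        // 200 with an access_token means the credentials are accepted here;
        // 401 reproduces the failure the JDBC driver reports.
        System.out.println(response.statusCode());
        System.out.println(response.body());
    }
}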
2 REPLIES

Renu_
Contributor

Hi @jakobhaggstrom, this error likely occurs because of the type of secret you're using. For M2M authentication, the Databricks JDBC driver requires a Databricks-generated OAuth secret, not a Microsoft Entra ID client secret. While your service principal credentials may work with other tools like dbt, JDBC connections will fail if the wrong secret type is provided.

Try switching to a Databricks OAuth secret and see if that resolves the issue.
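For reference, here is a minimal JDBC sketch using the same M2M parameters as the URL in the question, where OAuth2Secret is the Databricks-generated OAuth secret for the service principal. Host, httpPath, client ID, and secret are placeholders, not real values.

import java.sql.Connection;
import java.sql.DriverManager;
import java.util.Properties;

public class M2MConnect {
    public static void main(String[] args) throws Exception {
        // Placeholder workspace host and default schema.
        String url = "jdbc:databricks://host.azuredatabricks.net:443/schemaname";

        Properties props = new Properties();
        props.put("ssl", "1");
        props.put("httpPath", "path");          // SQL warehouse HTTP path
        props.put("AuthMech", "11");            // OAuth 2.0
        props.put("Auth_Flow", "1");            // client credentials (M2M)
        props.put("OAuth2ClientId", "clientid");// service principal application/client ID
        props.put("OAuth2Secret", "databricks-oauth-secret"); // Databricks OAuth secret, not an Entra ID secret

        try (Connection conn = DriverManager.getConnection(url, props)) {
            System.out.println("Connected: " + conn.getMetaData().getDatabaseProductName());
        }
    }
}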

kamal_ch
Databricks Employee

Additional information can be found here:

https://docs.databricks.com/aws/en/dev-tools/auth/oauth-m2m
