
Databricks JDBC Driver 2.7.3 with OAuth2 M2M on Databricks

der
New Contributor III

We have an application implemented in Java and installed as a JAR on the cluster. The application reads data from Unity Catalog via the Databricks JDBC Driver.

In the past we used PAT tokens for the service principal and everything worked fine. We have now switched to OAuth2 secrets generated in the Databricks account. This works perfectly when the JDBC driver runs outside the Databricks cluster, but if we run the same code in a Databricks notebook, we get an error.

Scala code run in the notebook:

import java.sql.{Connection, DriverManager, ResultSet}

Class.forName("com.databricks.client.jdbc.Driver")

val jdbcUrl = "jdbc:databricks://<HOST>:443;httpPath=<HTTP_PATH>;AuthMech=11;Auth_Flow=1;OAuth2ClientId=<CLIENT_ID>;OAuth2Secret=<DATABRICKS_GENERATED_SP_SECRET>"

val conn: Connection = DriverManager.getConnection(jdbcUrl)
val stmt = conn.createStatement()
val rs = stmt.executeQuery("SELECT current_user(), current_date()")

while (rs.next()) {
  println(s"${rs.getString(1)}\t${rs.getString(2)}")
}

rs.close()
stmt.close()
conn.close()
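
For reference: in this URL, AuthMech=11 selects OAuth 2.0 authentication and Auth_Flow=1 is the machine-to-machine client-credentials flow, in which the driver itself exchanges the client ID and secret for an access token.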

Error:

SQLException: [Databricks][JDBCDriver](500151) Error setting/closing session: Invalid local Address .
Caused by: GeneralException: [Databricks][JDBCDriver](500151) Error setting/closing session: Invalid local Address .
Caused by: TException: Invalid local Address 
	at com.databricks.client.hivecommon.api.HS2Client.openSession(Unknown Source)
	at com.databricks.client.hivecommon.api.HS2Client.<init>(Unknown Source)
	at com.databricks.client.spark.jdbc.DownloadableFetchClient.<init>(Unknown Source)
	at com.databricks.client.spark.jdbc.DownloadableFetchClientFactory.createClient(Unknown Source)
	at com.databricks.client.hivecommon.core.HiveJDBCCommonConnection.connectToServer(Unknown Source)
	at com.databricks.client.spark.core.SparkJDBCConnection.connectToServer(Unknown Source)
	at com.databricks.client.hivecommon.core.HiveJDBCCommonConnection.establishConnection(Unknown Source)
	at com.databricks.client.spark.core.SparkJDBCConnection.establishConnection(Unknown Source)
	at com.databricks.client.jdbc.core.LoginTimeoutConnection$1.call(Unknown Source)
	at com.databricks.client.jdbc.core.LoginTimeoutConnection$1.call(Unknown Source)
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
Caused by: com.databricks.client.support.exceptions.GeneralException: [Databricks][JDBCDriver](500151) Error setting/closing session: Invalid local Address .
	at com.databricks.client.hivecommon.api.HS2Client.openSession(Unknown Source)
	at com.databricks.client.hivecommon.api.HS2Client.<init>(Unknown Source)
	at com.databricks.client.spark.jdbc.DownloadableFetchClient.<init>(Unknown Source)
	at com.databricks.client.spark.jdbc.DownloadableFetchClientFactory.createClient(Unknown Source)
	at com.databricks.client.hivecommon.core.HiveJDBCCommonConnection.connectToServer(Unknown Source)
	at com.databricks.client.spark.core.SparkJDBCConnection.connectToServer(Unknown Source)
	at com.databricks.client.hivecommon.core.HiveJDBCCommonConnection.establishConnection(Unknown Source)
	at com.databricks.client.spark.core.SparkJDBCConnection.establishConnection(Unknown Source)
	at com.databricks.client.jdbc.core.LoginTimeoutConnection$1.call(Unknown Source)
	at com.databricks.client.jdbc.core.LoginTimeoutConnection$1.call(Unknown Source)
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
Caused by: com.databricks.client.jdbc42.internal.apache.thrift.TException: Invalid local Address 
	at com.databricks.client.jdbc.oauth.ClientCredentialOAuthProvider.obtainAccessToken(Unknown Source)
	at com.databricks.client.hivecommon.api.HS2OAuthClientWrapper.validateTokens(Unknown Source)
	at com.databricks.client.hivecommon.api.HS2OAuthClientWrapper.OpenSession(Unknown Source)
	at com.databricks.client.hivecommon.api.HS2Client.openSession(Unknown Source)
	at com.databricks.client.hivecommon.api.HS2Client.<init>(Unknown Source)
	at com.databricks.client.spark.jdbc.DownloadableFetchClient.<init>(Unknown Source)
	at com.databricks.client.spark.jdbc.DownloadableFetchClientFactory.createClient(Unknown Source)
	at com.databricks.client.hivecommon.core.HiveJDBCCommonConnection.connectToServer(Unknown Source)
	at com.databricks.client.spark.core.SparkJDBCConnection.connectToServer(Unknown Source)
	at com.databricks.client.hivecommon.core.HiveJDBCCommonConnection.establishConnection(Unknown Source)
	at com.databricks.client.spark.core.SparkJDBCConnection.establishConnection(Unknown Source)
	at com.databricks.client.jdbc.core.LoginTimeoutConnection$1.call(Unknown Source)
	at com.databricks.client.jdbc.core.LoginTimeoutConnection$1.call(Unknown Source)
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
	at java.base/java.lang.Thread.run(Thread.java:840)
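
From the stack trace, the innermost cause is thrown from ClientCredentialOAuthProvider.obtainAccessToken, so the driver appears to fail while fetching the OAuth token itself, not while opening the session.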
1 ACCEPTED SOLUTION


der
New Contributor III

According to the support team, I had to set the JDBC parameter OAuthEnabledIPAddressRanges. The IP range should be the resolved private link IP (usually starting with 10.x) of the hostname in the Databricks workspace URL.
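
A minimal sketch of what this can look like, assuming the parameter accepts the resolved address directly (check the exact value syntax, single IP vs. CIDR range, against the driver documentation); the InetAddress lookup is just one way to resolve the private link IP:

import java.net.InetAddress

// Resolve the workspace hostname to its private link IP (e.g. 10.x.x.x)
val host = "<HOST>"
val privateLinkIp = InetAddress.getByName(host).getHostAddress

val jdbcUrl = s"jdbc:databricks://$host:443;httpPath=<HTTP_PATH>;AuthMech=11;Auth_Flow=1;" +
  s"OAuth2ClientId=<CLIENT_ID>;OAuth2Secret=<DATABRICKS_GENERATED_SP_SECRET>;" +
  s"OAuthEnabledIPAddressRanges=$privateLinkIp"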


2 REPLIES

der
New Contributor III

If I generate the OAuth access token myself and use that token, it also works.

Create the token:

import java.net.{URL, HttpURLConnection}
import java.io.OutputStreamWriter
import scala.io.Source
import scala.util.parsing.json.JSON

// Databricks OAuth client credentials
val clientId = "<CLIENT_ID>"
val clientSecret = "<DATABRICKS_GENERATED_SP_SECRET>"
val tokenEndpoint = "https://<HOST>/oidc/v1/token"

// Client-credentials grant against the workspace token endpoint
val requestBody =
  s"grant_type=client_credentials&client_id=$clientId&client_secret=$clientSecret&scope=all-apis"

val url = new URL(tokenEndpoint)
val conn = url.openConnection().asInstanceOf[HttpURLConnection]
conn.setRequestMethod("POST")
conn.setDoOutput(true)
conn.setRequestProperty("Content-Type", "application/x-www-form-urlencoded")

val writer = new OutputStreamWriter(conn.getOutputStream)
writer.write(requestBody)
writer.flush()
writer.close()

// Parse the access token out of the JSON response
val response = Source.fromInputStream(conn.getInputStream).mkString
val parsed = JSON.parseFull(response).get.asInstanceOf[Map[String, Any]]
val accessToken = parsed("access_token").asInstanceOf[String]

Use the access token without error:

import java.sql.{Connection, DriverManager}

Class.forName("com.databricks.client.jdbc.Driver")

val jdbcUrl = s"jdbc:databricks://<HOST>:443;httpPath=<HTTP_PATH>;AuthMech=11;Auth_Flow=0;Auth_AccessToken=$accessToken"

val conn: Connection = DriverManager.getConnection(jdbcUrl)
val stmt = conn.createStatement()
val rs = stmt.executeQuery("SELECT current_user(), current_date()")

while (rs.next()) {
  println(s"${rs.getString(1)}\t${rs.getString(2)}")
}

rs.close()
stmt.close()
conn.close()
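
One caveat with this workaround: the access token returned by the client-credentials flow is short-lived (typically one hour), so a long-running application has to request a new token before the old one expires.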

 
