We have an application implemented in Java and installed as a JAR on the cluster. The application reads data from Unity Catalog via the Databricks JDBC Driver.
In the past we used PAT tokens for the service principal and everything worked fine. We have now switched to OAuth2 secrets generated in the Databricks account. This works perfectly as long as the JDBC driver runs outside a Databricks cluster; if we run the same code in a Databricks notebook, we get an error.
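For context, here is a minimal sketch contrasting the two connection-string styles we are talking about (old PAT-based vs. new OAuth2 M2M). All host, path, and credential values are placeholders, and the helper methods are hypothetical, only meant to make the parameter difference explicit:

```java
// Hypothetical helpers contrasting the two Databricks JDBC URL styles.
// <HOST>, <HTTP_PATH>, and credentials are placeholders, not real values.
public class JdbcUrls {

    // Old style: personal access token (AuthMech=3; UID is the literal "token").
    static String patUrl(String host, String httpPath, String pat) {
        return "jdbc:databricks://" + host + ":443;httpPath=" + httpPath
             + ";AuthMech=3;UID=token;PWD=" + pat;
    }

    // New style: OAuth2 client credentials / M2M (AuthMech=11, Auth_Flow=1)
    // with a secret generated in the Databricks account console.
    static String oauthUrl(String host, String httpPath, String clientId, String secret) {
        return "jdbc:databricks://" + host + ":443;httpPath=" + httpPath
             + ";AuthMech=11;Auth_Flow=1;OAuth2ClientId=" + clientId
             + ";OAuth2Secret=" + secret;
    }

    public static void main(String[] args) {
        System.out.println(patUrl("<HOST>", "<HTTP_PATH>", "<PAT>"));
        System.out.println(oauthUrl("<HOST>", "<HTTP_PATH>", "<CLIENT_ID>", "<SECRET>"));
    }
}
```

Only the authentication parameters differ between the two URLs; host and httpPath are identical in both setups.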
Scala code run in the notebook:
import java.sql.{Connection, DriverManager, ResultSet}
Class.forName("com.databricks.client.jdbc.Driver")
val jdbcUrl = "jdbc:databricks://<HOST>:443;httpPath=<HTTP_PATH>;AuthMech=11;Auth_Flow=1;OAuth2ClientId=<CLIENT_ID>;OAuth2Secret=<DATABRICKS_GENERATED_SP_SECRET>"
val conn: Connection = DriverManager.getConnection(jdbcUrl)
val stmt = conn.createStatement()
val rs = stmt.executeQuery("SELECT current_user(), current_date()")
while (rs.next()) {
  println(s"${rs.getString(1)}\t${rs.getString(2)}")
}
rs.close()
stmt.close()
conn.close()
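For reference, the same OAuth2 parameters can also be supplied as a java.util.Properties object instead of being inlined in the URL. This is just a readability sketch with placeholder values, not a fix we have verified for the error below; the property names mirror the URL parameters above:

```java
import java.util.Properties;

// Sketch: OAuth2 M2M parameters passed as connection properties rather than
// URL fragments. <HOST>/<HTTP_PATH>/credentials are placeholders.
public class JdbcProps {

    static Properties buildProps(String clientId, String secret) {
        Properties p = new Properties();
        p.setProperty("AuthMech", "11");   // OAuth2
        p.setProperty("Auth_Flow", "1");   // client credentials (M2M)
        p.setProperty("OAuth2ClientId", clientId);
        p.setProperty("OAuth2Secret", secret);
        return p;
    }

    public static void main(String[] args) {
        String url = "jdbc:databricks://<HOST>:443;httpPath=<HTTP_PATH>";
        Properties props = buildProps("<CLIENT_ID>", "<SECRET>");
        // With the Databricks JDBC driver on the classpath, the connection
        // would then be opened as:
        //   Connection conn = java.sql.DriverManager.getConnection(url, props);
        System.out.println(url + " with properties " + props.stringPropertyNames());
    }
}
```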
Error:
SQLException: [Databricks][JDBCDriver](500151) Error setting/closing session: Invalid local Address .
Caused by: GeneralException: [Databricks][JDBCDriver](500151) Error setting/closing session: Invalid local Address .
Caused by: TException: Invalid local Address
at com.databricks.client.hivecommon.api.HS2Client.openSession(Unknown Source)
at com.databricks.client.hivecommon.api.HS2Client.<init>(Unknown Source)
at com.databricks.client.spark.jdbc.DownloadableFetchClient.<init>(Unknown Source)
at com.databricks.client.spark.jdbc.DownloadableFetchClientFactory.createClient(Unknown Source)
at com.databricks.client.hivecommon.core.HiveJDBCCommonConnection.connectToServer(Unknown Source)
at com.databricks.client.spark.core.SparkJDBCConnection.connectToServer(Unknown Source)
at com.databricks.client.hivecommon.core.HiveJDBCCommonConnection.establishConnection(Unknown Source)
at com.databricks.client.spark.core.SparkJDBCConnection.establishConnection(Unknown Source)
at com.databricks.client.jdbc.core.LoginTimeoutConnection$1.call(Unknown Source)
at com.databricks.client.jdbc.core.LoginTimeoutConnection$1.call(Unknown Source)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
Caused by: com.databricks.client.support.exceptions.GeneralException: [Databricks][JDBCDriver](500151) Error setting/closing session: Invalid local Address .
at com.databricks.client.hivecommon.api.HS2Client.openSession(Unknown Source)
at com.databricks.client.hivecommon.api.HS2Client.<init>(Unknown Source)
at com.databricks.client.spark.jdbc.DownloadableFetchClient.<init>(Unknown Source)
at com.databricks.client.spark.jdbc.DownloadableFetchClientFactory.createClient(Unknown Source)
at com.databricks.client.hivecommon.core.HiveJDBCCommonConnection.connectToServer(Unknown Source)
at com.databricks.client.spark.core.SparkJDBCConnection.connectToServer(Unknown Source)
at com.databricks.client.hivecommon.core.HiveJDBCCommonConnection.establishConnection(Unknown Source)
at com.databricks.client.spark.core.SparkJDBCConnection.establishConnection(Unknown Source)
at com.databricks.client.jdbc.core.LoginTimeoutConnection$1.call(Unknown Source)
at com.databricks.client.jdbc.core.LoginTimeoutConnection$1.call(Unknown Source)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
Caused by: com.databricks.client.jdbc42.internal.apache.thrift.TException: Invalid local Address
at com.databricks.client.jdbc.oauth.ClientCredentialOAuthProvider.obtainAccessToken(Unknown Source)
at com.databricks.client.hivecommon.api.HS2OAuthClientWrapper.validateTokens(Unknown Source)
at com.databricks.client.hivecommon.api.HS2OAuthClientWrapper.OpenSession(Unknown Source)
at com.databricks.client.hivecommon.api.HS2Client.openSession(Unknown Source)
at com.databricks.client.hivecommon.api.HS2Client.<init>(Unknown Source)
at com.databricks.client.spark.jdbc.DownloadableFetchClient.<init>(Unknown Source)
at com.databricks.client.spark.jdbc.DownloadableFetchClientFactory.createClient(Unknown Source)
at com.databricks.client.hivecommon.core.HiveJDBCCommonConnection.connectToServer(Unknown Source)
at com.databricks.client.spark.core.SparkJDBCConnection.connectToServer(Unknown Source)
at com.databricks.client.hivecommon.core.HiveJDBCCommonConnection.establishConnection(Unknown Source)
at com.databricks.client.spark.core.SparkJDBCConnection.establishConnection(Unknown Source)
at com.databricks.client.jdbc.core.LoginTimeoutConnection$1.call(Unknown Source)
at com.databricks.client.jdbc.core.LoginTimeoutConnection$1.call(Unknown Source)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
at java.base/java.lang.Thread.run(Thread.java:840)