New Databricks Driver gives SQLNonTransientConnectionException when trying to connect to Databricks Instance

sriramkumar
New Contributor II
import com.databricks.client.jdbc.DataSource;

import java.sql.*;

public class TestDatabricks {
    public static void main(String[] args) throws SQLException {
        // Hostname and HttpPath copied from the instance's configuration
        String dbUrl = "jdbc:databricks://<hostname>:443;HttpPath=<HttpPath>;";
        DataSource d = new DataSource();
        d.setURL(dbUrl);
        d.setUserID("token");
        // Access token copied from User Settings
        d.setPassword("<access-token>");
        try (Connection conn = d.getConnection();
             Statement s = conn.createStatement();
             ResultSet r = s.executeQuery("show schemas")) {
            while (r.next()) {
                System.out.println(r.getString("databaseName"));
            }
        }
        System.out.println("Success");
    }
}

I am running this code to test the connection to my Databricks instance, but I get the following error:

java.sql.SQLNonTransientConnectionException: [Databricks][JDBC](12330) Cannot establish a successful connection with given properties.

Am I missing anything in the setup?

Also, why is the driver guide (Installation and Configuration) missing from the docs folder of the driver?

3 REPLIES

Atanu
Esteemed Contributor

This looks like it is due to maintenance in the US regions. Are you still facing the issue, @Sriramkumar Thamizharasan? Is your workspace in eastus or eastus2?

sriramkumar
New Contributor II

My workspace is in the us-west-2 region, and I still face the issue. Interestingly, if I keep the code the same and just change the JDBC URL to start with jdbc:spark:// instead of jdbc:databricks://, the connection is successful.

Also, could we get the Installation and Configuration Guide that comes with the driver?

sriramkumar
New Contributor II

I looked into the decompiled source and found that the DataSource class registers only "spark" as a subprotocol, whereas the Driver class also registers "databricks".
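To illustrate the mismatch: the class name `SubprotocolCheck`, the helper `subprotocol`, and the two accepted-subprotocol lists below are hypothetical, modeled on what the decompiled classes appear to accept. This sketch only parses the URL prefix the way a URL-acceptance check plausibly would; it does not call the driver itself.

```java
import java.util.Arrays;
import java.util.List;

public class SubprotocolCheck {
    // Extract the subprotocol from a JDBC URL of the form jdbc:<subprotocol>://...
    static String subprotocol(String url) {
        int start = url.indexOf(':') + 1;
        int end = url.indexOf(':', start);
        return url.substring(start, end);
    }

    public static void main(String[] args) {
        // Hypothetical lists, based on the decompiled observation above:
        // DataSource appears to register only "spark", while Driver also
        // registers "databricks".
        List<String> dataSourceAccepted = Arrays.asList("spark");
        List<String> driverAccepted = Arrays.asList("spark", "databricks");

        String url = "jdbc:databricks://<hostname>:443;HttpPath=<HttpPath>;";
        String sub = subprotocol(url);
        System.out.println(sub);                              // databricks
        System.out.println(dataSourceAccepted.contains(sub)); // false -> DataSource rejects it
        System.out.println(driverAccepted.contains(sub));     // true  -> Driver accepts it
    }
}
```

If that reading of the decompiled source is right, it would also explain a possible workaround: connecting via `DriverManager.getConnection(dbUrl, props)` routes through the Driver class (which accepts "databricks"), whereas instantiating DataSource directly does not.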