New Databricks Driver gives SQLNonTransientConnectionException when trying to connect to Databricks Instance

sriramkumar
New Contributor II
import com.databricks.client.jdbc.DataSource;

import java.sql.*;

public class testDatabricks {
    public static void main(String[] args) throws SQLException {
        // Hostname and HttpPath copied from the instance configuration
        String dbUrl = "jdbc:databricks://<hostname>:443;HttpPath=<HttpPath>;";
        DataSource d = new DataSource();
        d.setURL(dbUrl);
        d.setUserID("token");
        // Access token copied from User Settings
        d.setPassword("<access-token>");
        Connection conn = d.getConnection();
        Statement s = conn.createStatement();
        ResultSet r = s.executeQuery("show schemas");
        while (r.next()) {
            System.out.println(r.getString("databaseName"));
        }
        System.out.println("Success");
    }
}

I am trying to execute this code to test the connection to my Databricks instance, but I get the following error:

java.sql.SQLNonTransientConnectionException: [Databricks][JDBC](12330) Cannot establish a successful connection with given properties.

Am I missing anything in the setup?

Also, why is the driver guide (Installation and Configuration) missing from the docs folder of the driver?

3 REPLIES

Atanu
Esteemed Contributor

This looks like it could be due to maintenance in the US regions. Are you still facing the issue, @Sriramkumar Thamizharasan? Is your workspace in eastus or eastus2?

sriramkumar
New Contributor II

My workspace is in the us-west-2 region, and I still face the issue. Interestingly, if I keep the code the same and just change the JDBC URL to start with jdbc:spark:// instead of jdbc:databricks://, the connection is successful.
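For reference, the only change from the snippet above is the sub-protocol in the URL; everything else, placeholders included, stays the same:

String dbUrl = "jdbc:spark://<hostname>:443;HttpPath=<HttpPath>;";
DataSource d = new DataSource();
d.setURL(dbUrl);
d.setUserID("token");
d.setPassword("<access-token>");
Connection conn = d.getConnection(); // succeeds with the jdbc:spark:// sub-protocol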

Also, could we get the Installation and Configuration Guide that is supposed to ship with the driver?

sriramkumar
New Contributor II

I looked into the decompiled source and found that only "spark" is registered as a sub-protocol in the DataSource class, whereas the Driver class registers "databricks".
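Given that, a possible workaround is to connect through DriverManager, which dispatches to the Driver class and therefore accepts the jdbc:databricks:// sub-protocol. This is only a minimal sketch, assuming the driver JAR is on the classpath and that the standard user/password arguments carry the same token credentials as setUserID/setPassword in the original snippet (the class name here is just illustrative):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class testDatabricksDriverManager {
    public static void main(String[] args) throws SQLException {
        // Same placeholders as in the original snippet; DriverManager routes the URL
        // through the Driver class, which registers the "databricks" sub-protocol.
        String dbUrl = "jdbc:databricks://<hostname>:443;HttpPath=<HttpPath>;";
        try (Connection conn = DriverManager.getConnection(dbUrl, "token", "<access-token>");
             Statement s = conn.createStatement();
             ResultSet r = s.executeQuery("show schemas")) {
            while (r.next()) {
                System.out.println(r.getString("databaseName"));
            }
            System.out.println("Success");
        }
    }
}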
