Machine Learning
Dive into the world of machine learning on the Databricks platform. Explore discussions on algorithms, model training, deployment, and more. Connect with ML enthusiasts and experts.

[Databricks][DatabricksJDBCDriver](500593) Communication link failure. Failed to connect to server. Reason: HTTP Response code: 403

DipakBachhav
New Contributor III

I am trying to connect to Databricks using Java code. Can someone help me, please? Here is the code I have so far:

  import java.sql.Connection;
  import java.sql.DriverManager;
  import java.sql.SQLException;

  public class DatabricksSetup {

      public static void main(String[] args) throws SQLException {

          // Placeholders (XXXX) redacted; PWD is a personal access token
          // generated from the Databricks user settings page.
          String url = "jdbc:databricks://XXXX.azuredatabricks.net:443/default;transportMode=http;ssl=1;httpPath=sql/protocolv1/o/XXXXX;AuthMech=3;UID=token;PWD=XXXXXX";

          String username = "token";
          String password = "XXXX";

          Connection connection = DriverManager.getConnection(url, username, password);

          if (connection != null) {
              System.out.println("Connection Established");
          } else {
              System.out.println("Connection Failed");
          }
      }
  }

The following dependency is already added:

   "com.databricks:databricks-jdbc:2.6.25"
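As a side note, most connection failures like this trace back to a typo in one of the semicolon-separated URL options. A small helper (hypothetical names, illustrative host/httpPath values) that assembles the URL from its parts can make each option easier to verify:

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.stream.Collectors;

// Hypothetical helper: builds the Databricks JDBC URL from its parts so each
// option (transportMode, ssl, httpPath, AuthMech, UID, PWD) is set exactly once.
public class JdbcUrlBuilder {

    public static String build(String host, String httpPath, String token) {
        Map<String, String> opts = new LinkedHashMap<>();
        opts.put("transportMode", "http");
        opts.put("ssl", "1");
        opts.put("httpPath", httpPath);
        opts.put("AuthMech", "3");   // 3 = username/password; with UID=token the
        opts.put("UID", "token");    // password field carries the access token
        opts.put("PWD", token);
        String params = opts.entrySet().stream()
                .map(e -> e.getKey() + "=" + e.getValue())
                .collect(Collectors.joining(";"));
        return "jdbc:databricks://" + host + ":443/default;" + params;
    }

    public static void main(String[] args) {
        // Illustrative placeholders, not real workspace values.
        System.out.println(build("adb-123.azuredatabricks.net",
                "sql/protocolv1/o/123/0123-456789-abc123", "dapiXXXX"));
    }
}
```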

Error:

  Exception in thread "main" java.sql.SQLException: [Databricks][DatabricksJDBCDriver](500593) Communication link failure. Failed to connect to server. Reason: HTTP Response code: 403, Error message: Unknown.
    at com.databricks.client.hivecommon.api.HS2Client.handleTTransportException(Unknown Source)
    at com.databricks.client.spark.jdbc.DowloadableFetchClient.handleTTransportException(Unknown Source)
    at com.databricks.client.hivecommon.api.HS2Client.openSession(Unknown Source)
    at com.databricks.client.hivecommon.api.HS2Client.<init>(Unknown Source)
    at com.databricks.client.spark.jdbc.DowloadableFetchClient.<init>(Unknown Source)
    at com.databricks.client.spark.jdbc.DownloadableFetchClientFactory.createClient(Unknown Source)
    at com.databricks.client.hivecommon.core.HiveJDBCCommonConnection.connectToServer(Unknown Source)
    at com.databricks.client.spark.core.SparkJDBCConnection.connectToServer(Unknown Source)
    at com.databricks.client.hivecommon.core.HiveJDBCCommonConnection.establishConnection(Unknown Source)
    at com.databricks.client.spark.core.SparkJDBCConnection.establishConnection(Unknown Source)
    at com.databricks.client.jdbc.core.LoginTimeoutConnection.connect(Unknown Source)
    at com.databricks.client.jdbc.common.BaseConnectionFactory.doConnect(Unknown Source)
    at com.databricks.client.jdbc.common.AbstractDriver.connect(Unknown Source)
    at java.sql.DriverManager.getConnection(DriverManager.java:664)
    at java.sql.DriverManager.getConnection(DriverManager.java:247)
    at digital.eComm.ui.tests.DatabricksSetup.main(DatabricksSetup.java:16)
  Caused by: com.databricks.client.support.exceptions.ErrorException: [Databricks][DatabricksJDBCDriver](500593) Communication link failure. Failed to connect to server. Reason: HTTP Response code: 403, Error message: Unknown.
    ... 16 more

1 ACCEPTED SOLUTION

Accepted Solutions

karthik_p
Esteemed Contributor

@Dipak Bachhav​ Do you have any IP restrictions on access to your Databricks workspace? If so, you need to allow your client IP in the relevant security groups.
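To help tell an IP-restriction rejection apart from other failures, a sketch like the following (hypothetical helper, using the error text from this thread) checks the driver's message for the 403 status before retrying with different credentials; a 403 at connect time usually means the request was blocked before authentication, so fixing the token alone will not help:

```java
// Hypothetical helper: flags an HTTP 403 in the Databricks JDBC driver's error
// message. A 403 at connection time typically points to a network-level block
// (workspace IP access list, firewall, or security group) rather than a bad query.
public class ForbiddenCheck {

    public static boolean isHttp403(String driverMessage) {
        return driverMessage != null
                && driverMessage.contains("HTTP Response code: 403");
    }

    public static void main(String[] args) {
        String msg = "[Databricks][DatabricksJDBCDriver](500593) Communication link failure. "
                + "Failed to connect to server. Reason: HTTP Response code: 403, Error message: Unknown.";
        if (isHttp403(msg)) {
            System.out.println("403 Forbidden: check the workspace IP access list "
                    + "and security-group rules for your client IP.");
        }
    }
}
```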


