03-18-2022 04:51 AM
I am trying to use the Databricks Delta Lake Sink Connector (Confluent Cloud) to write to S3.
The connector starts up with the following error. Any help on this would be appreciated.
org.apache.kafka.connect.errors.ConnectException: java.sql.SQLException: [Simba][SparkJDBCDriver](500593) Communication link failure. Failed to connect to server. Reason: HTTP Response code: 403, Error message: Unknown.
at io.confluent.connect.databricks.deltalake.DatabricksDeltaLakeSinkTask.deltaLakeConnection(DatabricksDeltaLakeSinkTask.java:253)
at io.confluent.connect.databricks.deltalake.DatabricksDeltaLakeSinkTask.start(DatabricksDeltaLakeSinkTask.java:88)
at org.apache.kafka.connect.runtime.WorkerSinkTask.initializeAndStart(WorkerSinkTask.java:305)
at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:193)
at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:184)
at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:234)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.sql.SQLException: [Simba][SparkJDBCDriver](500593) Communication link failure. Failed to connect to server. Reason: HTTP Response code: 403, Error message: Unknown.
at com.simba.spark.hivecommon.api.HS2Client.handleTTransportException(Unknown Source)
at com.simba.spark.spark.jdbc.DowloadableFetchClient.handleTTransportException(Unknown Source)
at com.simba.spark.hivecommon.api.HS2Client.openSession(Unknown Source)
at com.simba.spark.hivecommon.api.HS2Client.<init>(Unknown Source)
at com.simba.spark.spark.jdbc.DowloadableFetchClient.<init>(Unknown Source)
at com.simba.spark.spark.jdbc.DownloadableFetchClientFactory.createClient(Unknown Source)
at com.simba.spark.hivecommon.core.HiveJDBCCommonConnection.establishConnection(Unknown Source)
at com.simba.spark.spark.core.SparkJDBCConnection.establishConnection(Unknown Source)
at com.simba.spark.jdbc.core.LoginTimeoutConnection$1.call(Unknown Source)
at com.simba.spark.jdbc.core.LoginTimeoutConnection$1.call(Unknown Source)
... 4 more
Caused by: com.simba.spark.support.exceptions.ErrorException: [Simba][SparkJDBCDriver](500593) Communication link failure. Failed to connect to server. Reason: HTTP Response code: 403, Error message: Unknown.
... 14 more
Caused by: com.simba.spark.jdbc42.internal.apache.thrift.transport.TTransportException: HTTP Response code: 403, Error message: Unknown
at com.simba.spark.hivecommon.api.TETHttpClient.handleHeaderErrorMessage(Unknown Source)
at com.simba.spark.hivecommon.api.TETHttpClient.handleErrorResponse(Unknown Source)
at com.simba.spark.hivecommon.api.TETHttpClient.flushUsingHttpClient(Unknown Source)
at com.simba.spark.hivecommon.api.TETHttpClient.flush(Unknown Source)
at com.simba.spark.jdbc42.internal.apache.thrift.TServiceClient.sendBase(TServiceClient.java:73)
at com.simba.spark.jdbc42.internal.apache.thrift.TServiceClient.sendBase(TServiceClient.java:62)
at com.simba.spark.jdbc42.internal.apache.hive.service.rpc.thrift.TCLIService$Client.send_OpenSession(TCLIService.java:147)
at com.simba.spark.hivecommon.api.HS2ClientWrapper.send_OpenSession(Unknown Source)
at com.simba.spark.jdbc42.internal.apache.hive.service.rpc.thrift.TCLIService$Client.OpenSession(TCLIService.java:139)
at com.simba.spark.hivecommon.api.HS2ClientWrapper.callOpenSession(Unknown Source)
at com.simba.spark.hivecommon.api.HS2ClientWrapper.access$1700(Unknown Source)
at com.simba.spark.hivecommon.api.HS2ClientWrapper$18.clientCall(Unknown Source)
at com.simba.spark.hivecommon.api.HS2ClientWrapper$18.clientCall(Unknown Source)
at com.simba.spark.hivecommon.api.HS2ClientWrapper.executeWithRetry(Unknown Source)
at com.simba.spark.hivecommon.api.HS2ClientWrapper.OpenSession(Unknown Source)
03-18-2022 05:06 AM
It is a connection issue. You can download the JDBC/ODBC driver to your local computer and validate all the connection settings. You can also test the connection using a BI tool like Power BI. If the settings are OK, one common issue is that your IP is blocked by security groups.
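For reference, a minimal sketch of such a local check with the Simba Spark JDBC driver (assuming the driver jar is on the classpath; the workspace host, HTTP path and personal access token are placeholders for the values from your connector config, and the driver class name may differ by driver version):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Minimal local connectivity check using the same Simba Spark JDBC driver the connector uses.
// <workspace-host>, <cluster-http-path> and <personal-access-token> are placeholders.
public class DatabricksJdbcCheck {
    public static void main(String[] args) throws Exception {
        // Driver class name for recent Simba Spark JDBC drivers; older jars may use
        // com.simba.spark.jdbc42.Driver instead.
        Class.forName("com.simba.spark.jdbc.Driver");

        String url = "jdbc:spark://<workspace-host>:443/default;"
                + "transportMode=http;ssl=1;"
                + "httpPath=<cluster-http-path>;"
                + "AuthMech=3;UID=token;PWD=<personal-access-token>";

        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT 1")) {
            if (rs.next()) {
                System.out.println("Connection OK, SELECT 1 returned " + rs.getInt(1));
            }
        }
    }
}

If this succeeds from your laptop but fails from the Connect worker, the driver settings themselves are fine and the problem is more likely network or IP-allow-list related.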
03-18-2022 05:09 AM
Hi @Hubert Dudek
Thanks for the response!
Yes, I did try it with a SQL client and the Simba Spark JDBC driver, and it works fine. But from the EC2 box where we run our Kafka connector, it fails with the above error. Is there any permission that has to be granted on the Databricks side (other than just using a user's token)?
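One way to narrow that down is a plain HTTPS probe from the EC2 host against the workspace REST API; a minimal sketch (the workspace host and token are placeholders, and /api/2.0/clusters/list is only used as a convenient authenticated endpoint):

import java.net.HttpURLConnection;
import java.net.URL;

// Reachability probe to run from the EC2 host where Kafka Connect runs.
// A 403 here as well would point at the workspace blocking this host's IP,
// rather than at the connector configuration itself.
public class WorkspaceReachabilityCheck {
    public static void main(String[] args) throws Exception {
        URL url = new URL("https://<workspace-host>/api/2.0/clusters/list");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("GET");
        conn.setRequestProperty("Authorization", "Bearer <personal-access-token>");
        System.out.println("HTTP response code from EC2: " + conn.getResponseCode());
        conn.disconnect();
    }
}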
03-18-2022 05:21 AM
There is also a staging S3 bucket in the configuration. You can check whether anything is written there; it could be an issue with access to S3 as well.
Please also check that everything is in the same region. The problem is that this error doesn't provide much information.
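If you want to rule out the S3 side, a quick listing of the staging bucket from the same EC2 host works; a sketch with the AWS SDK for Java v2 (the bucket name and region are placeholders, and credentials come from the default provider chain):

import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.ListObjectsV2Request;
import software.amazon.awssdk.services.s3.model.ListObjectsV2Response;

// Checks that the staging bucket used by the connector is reachable with the
// credentials available on this host. <staging-bucket-name> is a placeholder.
public class StagingBucketCheck {
    public static void main(String[] args) {
        try (S3Client s3 = S3Client.builder().region(Region.US_EAST_1).build()) {
            ListObjectsV2Response resp = s3.listObjectsV2(
                    ListObjectsV2Request.builder()
                            .bucket("<staging-bucket-name>")
                            .maxKeys(5)
                            .build());
            System.out.println("Staging bucket reachable; objects listed: " + resp.keyCount());
        }
    }
}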
03-22-2022 05:36 AM
Hi @Bency Mathew, did you follow the original doc?
03-24-2022 11:53 AM
Hi @Kaniz Fatma, yes we did. It looks like it was indeed an IP whitelisting issue. Thanks!
@Hubert Dudek @Kaniz Fatma
03-24-2022 12:06 PM
Great to hear. Could you choose my answer as the best one?
03-24-2022 12:08 PM
@Hubert Dudek Sure, just did. Thanks again!