JDBC Driver Error

dprutean
New Contributor III

I'm connecting to Databricks Unity Catalog and hit this error:

java.sql.SQLException: [Databricks][DatabricksJDBCDriver](500540) Error caught in BackgroundFetcher. Foreground thread ID: 59. Background thread ID: 61. Error caught: null.
    at com.databricks.client.hivecommon.dataengine.BackgroundFetcher.run(Unknown Source)
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
com.databricks.client.support.exceptions.GeneralException: [Databricks][DatabricksJDBCDriver](500540) Error caught in BackgroundFetcher. Foreground thread ID: 59. Background thread ID: 61. Error caught: null.
    at com.databricks.client.hivecommon.dataengine.BackgroundFetcher.run(Unknown Source)
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
java.lang.ExceptionInInitializerError
    at com.databricks.client.jdbc42.internal.apache.arrow.memory.ArrowBuf.getDirectBuffer(ArrowBuf.java:228)
    at com.databricks.client.jdbc42.internal.apache.arrow.memory.ArrowBuf.nioBuffer(ArrowBuf.java:223)
    at com.databricks.client.jdbc42.internal.apache.arrow.vector.ipc.ReadChannel.readFully(ReadChannel.java:87)
    at com.databricks.client.jdbc42.internal.apache.arrow.vector.ipc.message.MessageSerializer.readMessageBody(MessageSerializer.java:727)
    at com.databricks.client.jdbc42.internal.apache.arrow.vector.ipc.message.MessageSerializer.deserializeRecordBatch(MessageSerializer.java:363)
    at com.databricks.client.spark.arrow.ArrowBuffer.deserializeBatch(Unknown Source)
    at com.databricks.client.spark.arrow.ArrowBuffer.handleInitializeBuffer(Unknown Source)
    at com.databricks.client.hivecommon.api.HiveServer2BaseBuffer.initializeBuffer(Unknown Source)
    at com.databricks.client.hivecommon.api.RowsetBuffer.initializeBuffer(Unknown Source)
    at com.databricks.client.hivecommon.api.HS2Client.getRowSetInformation(Unknown Source)
    at com.databricks.client.hivecommon.api.HS2Client.fetchFromServer(Unknown Source)
    at com.databricks.client.spark.jdbc.DowloadableFetchClient.fetchNRows(Unknown Source)
    at com.databricks.client.hivecommon.api.HS2Client.fetchRows(Unknown Source)
    at com.databricks.client.hivecommon.dataengine.BackgroundFetcher.run(Unknown Source)
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
    at java.base/java.lang.Thread.run(Thread.java:833)


Kaniz
Community Manager

Hi @dprutean, thank you for providing the details about the error you're encountering while connecting to Databricks Unity Catalog with the Databricks JDBC driver.

Let’s troubleshoot this step by step:

  1. Check your connection string:

    • Let's verify a few things about your connection string:
      • Replace <hostname> with the server hostname of your Databricks workspace.
      • Ensure that you are passing a valid personal access token (with the Databricks driver this is typically AuthMech=3, UID=token and PWD=<access-token>).
      • Confirm that transportMode is set to http, and that ssl=1 if your workspace requires TLS (it usually does).
      • Make sure the port number (usually 443) is correct.
    • If you've double-checked these parameters and they are correct, proceed to the next step. A minimal Java connection sketch using these parameters follows this list.
  2. Update your JDBC driver:

    • It’s essential to use the latest version of the Databricks JDBC driver to avoid compatibility issues. You can download the driver from the Databricks website.
    • Replace your existing driver JAR file with the new one and restart your application or service that uses the JDBC driver.
    • Test the connection again after updating the driver. A short sketch for confirming which driver version your application actually loads follows this list.
  3. Network and firewall settings:

    • Ensure that there are no network or firewall restrictions preventing your application from connecting to the Databricks Unity Catalog.
    • Check if your network allows outbound connections to the Databricks cluster.
    • Verify that the necessary ports (e.g., 443 for HTTPS) are open. A quick TCP reachability check is sketched after this list.
  4. Additional information:

    • Could you provide more context about your setup? For example:
      • Are you running this from a specific programming language or tool (e.g., Python, Java, SQL client)?
      • Are there any specific steps you’re following to connect to the Unity Catalog?
      • Any additional logs or error messages you can share would be helpful.
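Here is a minimal Java sketch of the connection described in step 1. It is only an illustration under assumptions: token authentication (AuthMech=3), a workspace that requires TLS, and the Databricks JDBC driver JAR already on the classpath; <hostname>, <http-path> and <access-token> are placeholders for your own values.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class UnityCatalogConnectionTest {
    public static void main(String[] args) throws Exception {
        // Placeholder URL -- substitute your workspace hostname, HTTP path and personal access token.
        // The Databricks JDBC driver JAR must be on the classpath; JDBC 4.2 drivers register themselves.
        String url = "jdbc:databricks://<hostname>:443;"
                + "transportMode=http;ssl=1;"
                + "httpPath=<http-path>;"
                + "AuthMech=3;UID=token;PWD=<access-token>";

        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT current_catalog()")) {
            while (rs.next()) {
                System.out.println("Connected; current catalog: " + rs.getString(1));
            }
        }
    }
}
```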
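For step 2, it also helps to confirm which driver version your application actually loads at runtime (a stale JAR earlier on the classpath is a common surprise). This sketch reuses the same placeholder URL:

```java
import java.sql.Connection;
import java.sql.DatabaseMetaData;
import java.sql.DriverManager;

public class DriverVersionCheck {
    public static void main(String[] args) throws Exception {
        // Same placeholder connection URL as above.
        String url = "jdbc:databricks://<hostname>:443;transportMode=http;ssl=1;"
                + "httpPath=<http-path>;AuthMech=3;UID=token;PWD=<access-token>";

        try (Connection conn = DriverManager.getConnection(url)) {
            DatabaseMetaData md = conn.getMetaData();
            // Reports the driver the JVM actually loaded, not merely the one you downloaded.
            System.out.println("Driver name:    " + md.getDriverName());
            System.out.println("Driver version: " + md.getDriverVersion());
            System.out.println("JDBC version:   " + md.getJDBCMajorVersion() + "." + md.getJDBCMinorVersion());
        }
    }
}
```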
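For the network checks in step 3, a quick TCP reachability test against the workspace hostname can rule out firewall or proxy problems before you look any further; <hostname> is again a placeholder:

```java
import java.net.InetSocketAddress;
import java.net.Socket;

public class ConnectivityCheck {
    public static void main(String[] args) {
        String host = "<hostname>"; // your Databricks workspace hostname (placeholder)
        int port = 443;             // HTTPS port used by the JDBC driver

        try (Socket socket = new Socket()) {
            // Fails fast if an outbound firewall, proxy or DNS issue blocks the connection.
            socket.connect(new InetSocketAddress(host, port), 5_000);
            System.out.println("TCP connection to " + host + ":" + port + " succeeded.");
        } catch (Exception e) {
            System.out.println("Could not reach " + host + ":" + port + ": " + e.getMessage());
        }
    }
}
```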

Feel free to provide more details, and we’ll continue troubleshooting together! 😊