Data Engineering

Exception when using Java SQL client

amitca71
Contributor II

Hi,

I am trying to query Databricks over JDBC from a Java client. I can see that the query executes properly on the Databricks side.

However, on my client I get an exception (see below).

Versions:

JDK: jdk-20.0.1 (also tried version 16, same result)

https://www.oracle.com/il-en/java/technologies/downloads/#jdk20-mac

<dependency>
    <groupId>com.databricks</groupId>
    <artifactId>databricks-jdbc</artifactId>
    <version>2.6.33</version>
</dependency>

(other versions also gave the same error)
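For context, a minimal sketch of the kind of client code involved (the original source is not shown in the thread; the host, HTTP path, and token below are placeholders, and the driver registers itself with DriverManager, so no Class.forName call is needed):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class Main {
    public static void main(String[] args) throws Exception {
        // Placeholders: substitute your workspace hostname, warehouse HTTP path,
        // and personal access token.
        String url = "jdbc:databricks://<workspace-host>:443;transportMode=http;ssl=1;"
                + "AuthMech=3;httpPath=<http-path>;UID=token;PWD=<personal-access-token>";

        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT 1")) {
            while (rs.next()) {
                System.out.println(rs.getInt(1)); // prints 1 if the round trip works
            }
        }
    }
}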

-javaagent:/Applications/IntelliJ IDEA CE.app/Contents/lib/idea_rt.jar=64814:/Applications/IntelliJ IDEA CE.app/Contents/bin -Dfile.encoding=UTF-8 -Dsun.stdout.encoding=UTF-8 -Dsun.stderr.encoding=UTF-8 -classpath /Users/amitca/technology/databrickssql/target/classes:/Users/amitca/.m2/repository/com/databricks/databricks-jdbc/2.6.33/databricks-jdbc-2.6.33.jar org.example.Main

WARNING: sun.reflect.Reflection.getCallerClass is not supported. This will impact performance.

java.sql.SQLException: [Databricks][DatabricksJDBCDriver](500540) Error caught in BackgroundFetcher. Foreground thread ID: 1. Background thread ID: 37. Error caught: null.
    at com.databricks.client.hivecommon.dataengine.BackgroundFetcher.run(Unknown Source)
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:577)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
Caused by: com.databricks.client.support.exceptions.GeneralException: [Databricks][DatabricksJDBCDriver](500540) Error caught in BackgroundFetcher. Foreground thread ID: 1. Background thread ID: 37. Error caught: null.
    ... 5 more
Caused by: java.lang.ExceptionInInitializerError
    at com.databricks.client.jdbc42.internal.apache.arrow.memory.ArrowBuf.getDirectBuffer(ArrowBuf.java:228)
    at com.databricks.client.jdbc42.internal.apache.arrow.memory.ArrowBuf.nioBuffer(ArrowBuf.java:223)
    at com.databricks.client.jdbc42.internal.apache.arrow.vector.ipc.ReadChannel.readFully(ReadChannel.java:87)
    at com.databricks.client.jdbc42.internal.apache.arrow.vector.ipc.message.MessageSerializer.readMessageBody(MessageSerializer.java:727)
    at com.databricks.client.jdbc42.internal.apache.arrow.vector.ipc.message.MessageSerializer.deserializeRecordBatch(MessageSerializer.java:363)
    at com.databricks.client.spark.arrow.ArrowBuffer.deserializeBatch(Unknown Source)
    at com.databricks.client.spark.arrow.ArrowBuffer.handleInitializeBuffer(Unknown Source)
    at com.databricks.client.hivecommon.api.HiveServer2BaseBuffer.initializeBuffer(Unknown Source)
    at com.databricks.client.hivecommon.api.RowsetBuffer.initializeBuffer(Unknown Source)
    at com.databricks.client.hivecommon.api.HS2Client.getRowSetInformation(Unknown Source)
    at com.databricks.client.hivecommon.api.HS2Client.fetchFromServer(Unknown Source)
    at com.databricks.client.spark.jdbc.DowloadableFetchClient.fetchNRows(Unknown Source)
    at com.databricks.client.hivecommon.api.HS2Client.fetchRows(Unknown Source)
    at com.databricks.client.hivecommon.dataengine.BackgroundFetcher.run(Unknown Source)
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:577)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
    at java.base/java.lang.Thread.run(Thread.java:1623)
Caused by: java.lang.RuntimeException: Failed to initialize MemoryUtil.
    at com.databricks.client.jdbc42.internal.apache.arrow.memory.util.MemoryUtil.<clinit>(MemoryUtil.java:136)
    ... 19 more
Caused by: java.lang.reflect.InaccessibleObjectException: Unable to make field long java.nio.Buffer.address accessible: module java.base does not "opens java.nio" to unnamed module @5a9d6f02
    at java.base/java.lang.reflect.AccessibleObject.throwInaccessibleObjectException(AccessibleObject.java:387)
    at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:363)
    at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:311)
    at java.base/java.lang.reflect.Field.checkCanSetAccessible(Field.java:181)
    at java.base/java.lang.reflect.Field.setAccessible(Field.java:175)
    at com.databricks.client.jdbc42.internal.apache.arrow.memory.util.MemoryUtil.<clinit>(MemoryUtil.java:84)
    ... 19 more

Any idea?

Thanks.

Amit
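The last "Caused by" above is the actionable part of the trace: on JDK 9 and later, the Arrow code shaded into the driver reflectively accesses java.nio.Buffer.address, and the module system denies that access by default. One way to grant it, which a reply below also attempts, is the --add-opens flag at JVM startup. A sketch (the classpath is abbreviated; org.example.Main is the main class from the command line above):

java --add-opens=java.base/java.nio=ALL-UNNAMED -cp <classpath> org.example.Main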


4 Replies

Debayan
Esteemed Contributor III

Hi, could you please clarify: when you say "your client", do you mean a tool on your local system?

Please tag @Debayan in your next comment so that I will get notified. Thank you!

amitca71
Contributor II

(Accepted solution) It was a Java version issue (I work on a Mac): it did not work with OpenJDK 20, but it did work with adoptopenjdk-8.jdk.

ameyabapat
New Contributor II

I am getting the same error. I have added

'--add-opens=java.base/java.nio=ALL-UNNAMED'

to my JVM args, and I am using Java:

openjdk 17 2021-09-14
OpenJDK Runtime Environment (build 17+35-2724)
OpenJDK 64-Bit Server VM (build 17+35-2724, mixed mode, sharing)

Any solution?

How do I add this same JVM arg in pom.xml? I tried adding it through the Surefire plugin, but it does not pick up the arg. Can you please help?
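For the pom.xml question: the usual way to pass JVM flags to tests is Surefire's argLine setting. A minimal sketch (the plugin version is illustrative, and this is standard Surefire configuration, not something confirmed in this thread to fix the driver error):

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    <version>3.1.2</version>
    <configuration>
        <!-- Open java.nio to unnamed modules in the forked JVM that runs the tests -->
        <argLine>--add-opens=java.base/java.nio=ALL-UNNAMED</argLine>
    </configuration>
</plugin>

Note that argLine only affects the forked JVM that Surefire spawns for tests. Code run through exec:java executes inside the Maven process itself, so there the flag has to go into MAVEN_OPTS or a .mvn/jvm.config file instead.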
