Administration & Architecture

JDBC Error with OpenJDK 21

dprutean
New Contributor III

When connecting to Azure Databricks with the latest JDBC driver and OpenJDK 21, I get the error below.

I already set:

--add-opens=java.base/java.nio=ALL-UNNAMED
-Dio.netty.tryReflectionSetAccessible=true
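
For reference, here is a minimal sketch of how I am launching the application with those options, assuming they are passed directly on the java command line (the jar and class names are placeholders):

java --add-opens=java.base/java.nio=ALL-UNNAMED \
     -Dio.netty.tryReflectionSetAccessible=true \
     -cp app.jar:DatabricksJDBC42.jar com.example.Main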

 

java.sql.SQLException: [Databricks][JDBCDriver](500618) Error occured while deserializing arrow data: sun.misc.Unsafe or java.nio.DirectByteBuffer.<init>(long, int) not available
at com.databricks.client.spark.arrow.ArrowBuffer.deserializeBatch(Unknown Source)
at com.databricks.client.spark.arrow.ArrowBuffer.handleInitializeBuffer(Unknown Source)
at com.databricks.client.hivecommon.api.HiveServer2BaseBuffer.initializeBuffer(Unknown Source)
at com.databricks.client.hivecommon.api.RowsetBuffer.initializeBuffer(Unknown Source)
at com.databricks.client.hivecommon.api.HS2Client.getRowSetInformation(Unknown Source)
at com.databricks.client.hivecommon.api.HS2Client.fetchFromServer(Unknown Source)
at com.databricks.client.spark.jdbc.DownloadableFetchClient.fetchNRows(Unknown Source)
at com.databricks.client.hivecommon.api.HS2Client.fetchRows(Unknown Source)
at com.databricks.client.hivecommon.dataengine.BackgroundFetcher.run(Unknown Source)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:572)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
com.databricks.client.support.exceptions.GeneralException: [Databricks][JDBCDriver](500618) Error occured while deserializing arrow data: sun.misc.Unsafe or java.nio.DirectByteBuffer.<init>(long, int) not available
at com.databricks.client.spark.arrow.ArrowBuffer.deserializeBatch(Unknown Source)
at com.databricks.client.spark.arrow.ArrowBuffer.handleInitializeBuffer(Unknown Source)
at com.databricks.client.hivecommon.api.HiveServer2BaseBuffer.initializeBuffer(Unknown Source)
at com.databricks.client.hivecommon.api.RowsetBuffer.initializeBuffer(Unknown Source)
at com.databricks.client.hivecommon.api.HS2Client.getRowSetInformation(Unknown Source)
at com.databricks.client.hivecommon.api.HS2Client.fetchFromServer(Unknown Source)
at com.databricks.client.spark.jdbc.DownloadableFetchClient.fetchNRows(Unknown Source)
at com.databricks.client.hivecommon.api.HS2Client.fetchRows(Unknown Source)
at com.databricks.client.hivecommon.dataengine.BackgroundFetcher.run(Unknown Source)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:572)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
java.lang.UnsupportedOperationException: sun.misc.Unsafe or java.nio.DirectByteBuffer.<init>(long, int) not available
at com.databricks.client.jdbc42.internal.apache.arrow.memory.util.MemoryUtil.directBuffer(MemoryUtil.java:167)
at com.databricks.client.jdbc42.internal.apache.arrow.memory.ArrowBuf.getDirectBuffer(ArrowBuf.java:229)
at com.databricks.client.jdbc42.internal.apache.arrow.memory.ArrowBuf.nioBuffer(ArrowBuf.java:224)
at com.databricks.client.jdbc42.internal.apache.arrow.vector.ipc.ReadChannel.readFully(ReadChannel.java:87)
at com.databricks.client.jdbc42.internal.apache.arrow.vector.ipc.message.MessageSerializer.readMessageBody(MessageSerializer.java:728)
at com.databricks.client.jdbc42.internal.apache.arrow.vector.ipc.message.MessageSerializer.deserializeRecordBatch(MessageSerializer.java:363)
at com.databricks.client.spark.arrow.ArrowBuffer.deserializeBatch(Unknown Source)
at com.databricks.client.spark.arrow.ArrowBuffer.handleInitializeBuffer(Unknown Source)
at com.databricks.client.hivecommon.api.HiveServer2BaseBuffer.initializeBuffer(Unknown Source)
at com.databricks.client.hivecommon.api.RowsetBuffer.initializeBuffer(Unknown Source)
at com.databricks.client.hivecommon.api.HS2Client.getRowSetInformation(Unknown Source)
at com.databricks.client.hivecommon.api.HS2Client.fetchFromServer(Unknown Source)
at com.databricks.client.spark.jdbc.DownloadableFetchClient.fetchNRows(Unknown Source)
at com.databricks.client.hivecommon.api.HS2Client.fetchRows(Unknown Source)
at com.databricks.client.hivecommon.dataengine.BackgroundFetcher.run(Unknown Source)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:572)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
at java.base/java.lang.Thread.run(Thread.java:1583)

1 REPLY

151640
New Contributor III

Driver version 02.06.38.1068, using SQLSquirrel with ibm-semeru-certified-17-jdk (17.0.11+9).
With the JRE option --add-opens=jdk.unsupported/sun.misc=ALL-UNNAMED it fails; the option should be --add-opens=java.base/java.nio=ALL-UNNAMED.

caused by: java.lang.reflect.InaccessibleObjectException: Unable to make field long java.nio.Buffer.address accessible: module java.base does not "opens java.nio" to unnamed module @927e3d9c
at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:354)
at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:297)
at java.base/java.lang.reflect.Field.checkCanSetAccessible(Field.java:184)
at java.base/java.lang.reflect.Field.setAccessible(Field.java:178)
at com.databricks.client.jdbc42.internal.apache.arrow.memory.util.MemoryUtil.<clinit>(MemoryUtil.java:84)
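
As a quick sanity check, here is a minimal connection-test sketch, assuming the JVM is started with --add-opens=java.base/java.nio=ALL-UNNAMED and the Databricks JDBC driver is on the classpath; the host, HTTP path and token are placeholders:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class DatabricksJdbcSmokeTest {
    public static void main(String[] args) throws Exception {
        // Placeholder connection string; fill in your workspace values.
        String url = "jdbc:databricks://<server-hostname>:443;"
                + "httpPath=<http-path>;AuthMech=3;UID=token;PWD=<personal-access-token>";
        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement();
             // Fetching rows goes through the Arrow deserialization path where the
             // InaccessibleObjectException shows up if the --add-opens flag is missing.
             ResultSet rs = stmt.executeQuery("SELECT 1")) {
            while (rs.next()) {
                System.out.println(rs.getInt(1));
            }
        }
    }
}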

