JDBC Error: Error occured while deserializing arrow data
07-29-2022 07:15 PM
I am getting the following error in my Java application.
java.sql.SQLException: [Databricks][DatabricksJDBCDriver](500618) Error occured while deserializing arrow data: sun.misc.Unsafe or java.nio.DirectByteBuffer.<init>(long, int) not available
I believe this is the same issue described here.
Java version:
java version "18.0.1.1" 2022-04-22
Java(TM) SE Runtime Environment (build 18.0.1.1+2-6)
Java HotSpot(TM) 64-Bit Server VM (build 18.0.1.1+2-6, mixed mode, sharing)
I am using the recommended flag:
-Dio.netty.tryReflectionSetAccessible=true
Is the source code for the JDBC driver available? Any suggestions on how to fix this would be appreciated. Also if there is an example app that uses the JDBC driver, I'd be interested in trying that as well. Thanks!
- Labels: Deserializing Arrow Data, Jdbc
11-27-2022 11:27 AM
Please try adding the --add-opens flag shown below to the java command line when you launch your JVM:
% javac -classpath SparkJDBC42Example.jar:. jdbc_example.java
% java --add-opens=java.base/java.nio=ALL-UNNAMED -classpath SparkJDBC42Example.jar:. jdbc_example
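If it helps, the jdbc_example program referenced above can be a minimal class along the following lines. This is only a sketch: the workspace host, HTTP path, and personal access token in the JDBC URL are placeholders, and the URL prefix depends on the driver version (the older Simba-based driver uses jdbc:spark://, while newer releases use jdbc:databricks://).

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class jdbc_example {
    public static void main(String[] args) throws Exception {
        // Placeholder connection string: substitute your workspace host,
        // HTTP path, and personal access token.
        String url = "jdbc:spark://<workspace-host>:443/default;"
                + "transportMode=http;ssl=1;AuthMech=3;"
                + "httpPath=<http-path>;UID=token;PWD=<personal-access-token>";

        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement();
             // Reading the result set exercises the Arrow deserialization
             // path that fails without the --add-opens flag.
             ResultSet rs = stmt.executeQuery("SELECT 1")) {
            while (rs.next()) {
                System.out.println(rs.getInt(1));
            }
        }
    }
}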
02-18-2025 12:27 AM
For anyone encountering this issue in 2025, I was able to solve it by using the
--add-opens=jdk.unsupported/sun.misc=ALL-UNNAMED
option in combination with the latest JDBC driver (v2.7.1). I was using the driver in DBeaver, but I assume the issue can also be solved the same way in similar environments.
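If you are troubleshooting this in your own application (or any other environment where you control how the JVM is launched), one quick sanity check is to print the JVM's startup arguments and confirm the --add-opens flag actually reached the process. A minimal sketch using the standard java.lang.management API (the class name is just a placeholder):

import java.lang.management.ManagementFactory;

public class JvmArgsCheck {
    public static void main(String[] args) {
        // Prints the options the current JVM was started with, so you can
        // confirm that --add-opens and -Dio.netty.tryReflectionSetAccessible=true
        // are actually present in the running process.
        ManagementFactory.getRuntimeMXBean().getInputArguments()
                .forEach(System.out::println);
    }
}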

