03-02-2022 08:14 AM
I have a Java program like this to test out the Databricks JDBC connection with the Databricks JDBC driver.
Connection connection = null;
try {
    // Load the Databricks (Simba Spark) JDBC driver class
    Class.forName(driver);
    connection = DriverManager.getConnection(url, username, password);
    if (connection != null) {
        System.out.println("Connection Established");
    } else {
        System.out.println("Connection Failed");
    }
    // Run a simple query against the table and print each row
    Statement statement = connection.createStatement();
    ResultSet rs = statement.executeQuery("select * from standard_info_service.daily_transactions");
    while (rs.next()) {
        System.out.print("created_date: " + rs.getInt("created_date") + ", ");
        System.out.println("daily_transactions: " + rs.getInt("daily_transactions"));
    }
} catch (Exception e) {
    System.out.println(e);
}
This program, however, throws an error like this:
Connection Established
WARNING: sun.reflect.Reflection.getCallerClass is not supported. This will impact performance.
java.sql.SQLException: [Simba][SparkJDBCDriver](500618) Error occured while deserializing arrow data: sun.misc.Unsafe or java.nio.DirectByteBuffer.<init>(long, int) not available
What will be the solution?
03-09-2022 06:09 PM
This error is mentioned in the Spark documentation (https://spark.apache.org/docs/latest/). It looks like it is specific to the Java version in use and can be avoided by setting the JVM properties mentioned there.
03-22-2022 09:50 AM
Hi @Tony Zhou ,
Just a friendly follow-up. Did @Kaniz Fatma's response help you resolve this issue? If not, please share more details, such as the full error stack trace and some code snippets.
03-23-2022 05:19 PM
Hi @Jose Gonzalez ,
This similar issue with the Snowflake JDBC driver is a good reference. I was able to get this to work on Java OpenJDK 17 by specifying this JVM option:
--add-opens=java.base/java.nio=ALL-UNNAMED
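For reference, that flag just goes on the java command line when launching the test program; the classpath and main class shown here are placeholders, not from the original post:
java --add-opens=java.base/java.nio=ALL-UNNAMED -cp <app-and-driver-classpath> <MainClass>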
However, I ran into another issue when using Apache Commons DBCP to connect to the Databricks SQL endpoint:
Caused by: java.sql.SQLFeatureNotSupportedException: [Simba][JDBC](10220) Driver does not support this optional feature.
at com.simba.spark.exceptions.ExceptionConverter.toSQLException(Unknown Source)
at com.simba.spark.jdbc.common.SConnection.setAutoCommit(Unknown Source)
at com.simba.spark.jdbc.jdbc42.DSS42Connection.setAutoCommit(Unknown Source)
at org.apache.commons.dbcp2.DelegatingConnection.setAutoCommit(DelegatingConnection.java:801)
at org.apache.commons.dbcp2.DelegatingConnection.setAutoCommit(DelegatingConnection.java:801)
The same problem occurred after I switched to Hikari.
Finally, I got it working by just using BasicDataSource and setting auto-commit to false. BasicDataSource is not suitable for production though; would there be a new driver release that can handle this better?
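For anyone else hitting this, here is a minimal sketch of that BasicDataSource workaround. The driver class name, JDBC URL format, and credentials below are assumptions based on the Simba Spark driver shown in the stack trace, not values from the original post, so adjust them for your own SQL endpoint:
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.Statement;
import org.apache.commons.dbcp2.BasicDataSource;

public class DatabricksDbcpExample {
    public static void main(String[] args) throws Exception {
        BasicDataSource ds = new BasicDataSource();
        // Assumed driver class and URL format for the Simba Spark (Databricks) JDBC driver.
        ds.setDriverClassName("com.simba.spark.jdbc.Driver");
        ds.setUrl("jdbc:spark://<workspace-host>:443/default;transportMode=http;ssl=1;httpPath=<http-path>;AuthMech=3");
        ds.setUsername("token");
        ds.setPassword("<personal-access-token>");
        // Keep the pool's default auto-commit at false so it does not call
        // Connection.setAutoCommit(true), which this driver rejects.
        ds.setDefaultAutoCommit(false);

        try (Connection conn = ds.getConnection();
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT 1")) {
            while (rs.next()) {
                System.out.println(rs.getInt(1));
            }
        }
    }
}
On Java 17 this still needs the --add-opens flag above when launching the JVM.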
03-24-2022 08:58 AM
Thanks a lot @Alice Hung, your suggestion works. I am really grateful to you for sharing it. There is absolutely no help available elsewhere.
03-24-2022 09:47 AM
Definitely, I would want to. But I can't find an option to mark it as the best.
03-24-2022 10:07 AM
I am sorry @Kaniz Fatma but I don't see that option available to me. If you see it, kindly use it on my behalf.