03-02-2022 08:14 AM
I have a Java program like this to test the connection to Databricks using the Databricks JDBC driver.
Connection connection = null;
try {
    // Load the Databricks (Simba Spark) JDBC driver class
    Class.forName(driver);
    connection = DriverManager.getConnection(url, username, password);
    if (connection != null) {
        System.out.println("Connection Established");
    } else {
        System.out.println("Connection Failed");
    }
    // Run a simple query and print each row
    Statement statement = connection.createStatement();
    ResultSet rs = statement.executeQuery("select * from standard_info_service.daily_transactions");
    while (rs.next()) {
        System.out.print("created_date: " + rs.getInt("created_date") + ", ");
        System.out.println("daily_transactions: " + rs.getInt("daily_transactions"));
    }
} catch (Exception e) {
    System.out.println(e);
}
This program, however, throws an error like this:
Connection Established
WARNING: sun.reflect.Reflection.getCallerClass is not supported. This will impact performance.
java.sql.SQLException: [Simba][SparkJDBCDriver](500618) Error occured while deserializing arrow data: sun.misc.Unsafe or java.nio.DirectByteBuffer.<init>(long, int) not available
What would be the solution?
03-09-2022 06:09 PM
This error is mentioned in the Spark documentation (https://spark.apache.org/docs/latest/). It looks like it is specific to the Java version in use and can be avoided by setting the properties mentioned there.
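For example, assuming the program is launched directly with the java command on Java 11 or later, those properties would typically be passed as JVM options like this (the jar and class names below are only placeholders):

java -Dio.netty.tryReflectionSetAccessible=true \
     --add-opens=java.base/java.nio=ALL-UNNAMED \
     -cp app.jar:DatabricksJDBC42.jar Main

-Dio.netty.tryReflectionSetAccessible=true is the property the Spark docs call out for the Apache Arrow library on newer Java versions, and --add-opens=java.base/java.nio=ALL-UNNAMED re-opens the java.nio internals that the driver's Arrow deserialization relies on.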
03-22-2022 09:50 AM
Hi @Tony Zhou,
Just a friendly follow-up: did @Kaniz Fatma's response help you resolve this issue? If not, please share more details, like the full error stack trace and some code snippets.
03-23-2022 05:19 PM
Hi @Jose Gonzalez,
This similar issue with the Snowflake JDBC driver is a good reference. I was able to get this to work on Java OpenJDK 17 by specifying this JVM option:
--add-opens=java.base/java.nio=ALL-UNNAMED
However, I then ran into another issue when using Apache DBCP to connect to a Databricks SQL endpoint:
Caused by: java.sql.SQLFeatureNotSupportedException: [Simba][JDBC](10220) Driver does not support this optional feature.
at com.simba.spark.exceptions.ExceptionConverter.toSQLException(Unknown Source)
at com.simba.spark.jdbc.common.SConnection.setAutoCommit(Unknown Source)
at com.simba.spark.jdbc.jdbc42.DSS42Connection.setAutoCommit(Unknown Source)
at org.apache.commons.dbcp2.DelegatingConnection.setAutoCommit(DelegatingConnection.java:801)
at org.apache.commons.dbcp2.DelegatingConnection.setAutoCommit(DelegatingConnection.java:801)
The same problem occurred after I switched to HikariCP.
Finally, I got it working by just using BasicDataSource with auto-commit set to false. BasicDataSource is not suitable for production, though; would there be a new driver release that can handle this better?
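For reference, a minimal sketch of that workaround with commons-dbcp2 would look something like this (the JDBC URL, token, and query are placeholders, and the driver class name is inferred from the com.simba.spark packages in the stack trace above):

import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.Statement;
import org.apache.commons.dbcp2.BasicDataSource;

public class DatabricksDbcpExample {
    public static void main(String[] args) throws Exception {
        BasicDataSource ds = new BasicDataSource();
        ds.setDriverClassName("com.simba.spark.jdbc.Driver");
        // Placeholder URL; use the JDBC URL from your own SQL endpoint
        ds.setUrl("jdbc:spark://<workspace-host>:443/default;transportMode=http;ssl=1;httpPath=<http-path>;AuthMech=3");
        ds.setUsername("token");
        ds.setPassword("<personal-access-token>");
        // Key part of the workaround described above: default auto-commit set to false
        ds.setDefaultAutoCommit(false);

        try (Connection conn = ds.getConnection();
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("select 1")) {
            while (rs.next()) {
                System.out.println(rs.getInt(1));
            }
        }
    }
}

The query and pool settings here are just illustrative; the important line is setDefaultAutoCommit(false).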
03-24-2022 08:58 AM
Thanks a lot @Alice Hung, your suggestion works. I am really grateful to you for sharing it; there is absolutely no help available elsewhere.
03-24-2022 09:47 AM
Definitely, I would want to, but I can't find an option to mark it as the best answer.
03-24-2022 10:07 AM
I am sorry, @Kaniz Fatma, but I don't see that option available to me. If you see it, kindly use it on my behalf.