03-18-2024 03:31 AM
We are facing an issue integrating our Spring Boot JPA-based application with Databricks.
Below are the steps and settings we used for the integration.
When we start the Spring Boot application, we get the following warning:
HikariPool-1 - Driver does not support get/set network timeout for connections. ([Databricks][JDBC](10220) Driver does not support this optional feature.)
and the application starts anyway.
When interacting with the DB for any CRUD operation, we get the exception: Caused by: java.sql.SQLFeatureNotSupportedException: [Databricks][JDBC](10220) Driver does not support this optional feature.
Driver used: com.databricks.client.jdbc.Driver
Url: jdbc:databricks://************************t:443/default;transportMode=http;ssl=1;httpPath=sql/protocolv1/o/778046939806498/0307-074105-********;AuthMech=3;UID=token;PWD=*****************
Dialect: org.hibernate.dialect.MySQLDialect
Below are the other driver classes we tried as well, as suggested in the databricks-jdbc-driver-install-and-configuration-guide:
com.databricks.client.jdbc.DataSource
com.databricks.client.jdbc42.Driver
com.databricks.client.jdbc42.DataSource
We would also like to highlight that we are able to make a single connection and perform CRUD operations with custom queries, but not with Spring Data JPA.
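For reference, the working single-connection path looks roughly like the sketch below (a plain-JDBC check, not our actual code; the host, HTTP path, and token are placeholders):

```java
// Minimal plain-JDBC sanity check against Databricks, assuming the
// Databricks JDBC driver jar is on the classpath. All connection
// details below are placeholders, not real values.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class DatabricksConnectionCheck {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:databricks://<workspace-host>:443/default;"
                + "transportMode=http;ssl=1;httpPath=<http-path>;"
                + "AuthMech=3;UID=token;PWD=<personal-access-token>";
        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT 1")) {
            while (rs.next()) {
                System.out.println(rs.getInt(1)); // prints 1 on success
            }
        }
    }
}
```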
03-20-2024 04:39 AM
Hi Kaniz,
I have followed the steps mentioned above, but they didn't solve the problem. I can now retrieve data using JPA, but transactional methods such as save and saveAll, which JPA uses to issue INSERT and UPDATE queries, are failing. However, I was able to run explicit INSERT and UPDATE queries. I am now getting the error below.
org.springframework.dao.InvalidDataAccessApiUsageException: No EntityManager with actual transaction available for current thread - cannot reliably process 'persist' call; nested exception is javax.persistence.TransactionRequiredException: No EntityManager with actual transaction available for current thread - cannot reliably process 'persist' call
I wasn't able to switch auto-commit to false:
spring.datasource.hikari.auto-commit=false
It threw the error below when I tried to switch auto-commit to false. I tried it in the properties file as well as when making the connection programmatically.
java.sql.SQLFeatureNotSupportedException: [Databricks][JDBC](10220) Driver does not support this optional feature.
It would be highly appreciated if you could help me connect a Java Spring Boot JPA application to Delta Lake.
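For context, one way to issue the explicit INSERT/UPDATE without going through a JPA-managed transaction is a JdbcTemplate-based write path, sketched below; the table and column names are made up for illustration:

```java
// Hedged sketch: reads stay on JPA, writes go through explicit SQL via
// JdbcTemplate (auto-configured by Spring Boot for the single DataSource),
// sidestepping the JPA 'persist' transaction requirement. The
// default.employee table and its columns are hypothetical.
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.stereotype.Repository;

@Repository
public class EmployeeWriteDao {

    private final JdbcTemplate jdbcTemplate;

    public EmployeeWriteDao(JdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = jdbcTemplate;
    }

    // Explicit INSERT instead of JpaRepository.save(newEntity)
    public int insert(long id, String name) {
        return jdbcTemplate.update(
                "INSERT INTO default.employee (id, name) VALUES (?, ?)", id, name);
    }

    // Explicit UPDATE instead of JpaRepository.save(changedEntity)
    public int updateName(long id, String name) {
        return jdbcTemplate.update(
                "UPDATE default.employee SET name = ? WHERE id = ?", name, id);
    }
}
```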
03-27-2024 10:29 PM
@satishnavik - The error message indicates that Spring JPA is unable to manage transactions with Delta Lake because the Delta Lake JDBC driver doesn't support them.
The Issue: Spring Data JPA manages writes through JDBC transactions (auto-commit, commit, rollback), and the Databricks JDBC driver reports those operations as unsupported optional features, so any 'persist' call that needs an active transaction fails.
Possible Solutions:
Since JPA transactions won't work with Delta Lake, here are alternative approaches:
Native Delta Lake API: interact with Delta Lake through its native APIs or plain SQL instead of JPA-managed entities.
Separate Layers: keep JPA for reads and move writes into a separate data-access layer that issues explicit SQL.
Spring Data JPA with another Datasource: point JPA at a transactional database it fully supports and synchronize that data with Delta Lake separately.
Choosing the Right Approach:
The best approach depends on your specific needs; a configuration sketch for the datasource-based options follows below.
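To make the "Separate Layers" and "another Datasource" options concrete, here is a rough configuration sketch, assuming Spring Boot's standard multi-datasource setup; the app.datasource.databricks.* property prefix and the bean names are illustrative, not from the original post:

```java
// Sketch only: JPA keeps the default spring.datasource.* (a transactional DB),
// while Databricks gets its own DataSource plus a JdbcTemplate for native SQL.
import javax.sql.DataSource;

import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.boot.jdbc.DataSourceBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.core.JdbcTemplate;

@Configuration
public class DatabricksDataSourceConfig {

    // Bound from app.datasource.databricks.jdbc-url / .driver-class-name etc.
    // (Hikari property naming). The primary JPA DataSource must be marked
    // @Primary elsewhere so Spring Boot auto-configuration keeps using it.
    @Bean
    @ConfigurationProperties("app.datasource.databricks")
    public DataSource databricksDataSource() {
        return DataSourceBuilder.create().build();
    }

    @Bean
    public JdbcTemplate databricksJdbcTemplate(
            @Qualifier("databricksDataSource") DataSource dataSource) {
        return new JdbcTemplate(dataSource);
    }
}
```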
06-06-2024 06:55 AM
@satishnavik I am also facing the same issue with my Spring app. Can I ask what versions of the Databricks JDBC driver and Spring Boot you are using?
My app starts, but I get a series of errors stemming from log4j:
ERROR StatusLogger Unable to create Lookup for bundle
java.lang.ClassCastException: class org.apache.logging.log4j.core.lookup.ResourceBundleLookup
at java.base/java.lang.Class.asSubclass(Class.java:3924)
at com.databricks.client.jdbc42.internal.apache.logging.log4j.core.lookup.Interpolator.<init>(Interpolator.java:84)
at com.databricks.client.jdbc42.internal.apache.logging.log4j.core.lookup.Interpolator.<init>(Interpolator.java:105)[...]
ERROR StatusLogger Unable to create Lookup for ctx
java.lang.ClassCastException: class org.apache.logging.log4j.core.lookup.ContextMapLookup
at java.base/java.lang.Class.asSubclass(Class.java:3924)
at com.databricks.client.jdbc42.internal.apache.logging.log4j.core.lookup.Interpolator.<init>(Interpolator.java:84)
at com.databricks.client.jdbc42.internal.apache.logging.log4j.core.lookup.Interpolator.<init>(Interpolator.java:105)
04-03-2024 12:19 AM
Thanks @SanjayTS for the response.
Unfortunately, none of the approaches seems very promising given the dependencies and effort required to make these changes.
In the current scenario, we are looking for ready-to-use drivers or options that minimize new effort.
We have already tried multiple versions of the Databricks JDBC driver, and we also tried setting the auto-commit flag to false, but the Databricks driver does not support this feature at present.
A separate data layer is also not an option, as it doesn't support update/delete operations, apart from the additional overhead.
Spring Data JPA with another Datasource: this approach is not approved by our architecture team, as it results in additional cost and duplication of data.
Native Delta Lake API:
We are exploring integration via the native Delta Lake API approach, but it requires a significant amount of code change at the repository layer.
Using the JdbcTemplate approach, we are able to create a test connection and interact with Databricks Delta Lake tables by rewriting the existing JPA code as native queries. However, there are issues related to Spring Boot security and sessions that cause connection problems, as well as significant performance degradation.
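To illustrate the kind of rewrite involved, here is a minimal sketch of turning a Spring Data JPA finder into a native query through JdbcTemplate; the table and column names are hypothetical:

```java
// Sketch: a JPA derived query rewritten as native SQL over the Databricks
// connection. Assumes a JdbcTemplate wired to the Databricks DataSource;
// default.employee and its columns are placeholders.
import java.util.List;
import org.springframework.jdbc.core.JdbcTemplate;

public class EmployeeQueryDao {

    private final JdbcTemplate jdbcTemplate;

    public EmployeeQueryDao(JdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = jdbcTemplate;
    }

    // Was: List<Employee> findByDepartment(String dept) on a JpaRepository
    public List<String> findNamesByDepartment(String dept) {
        return jdbcTemplate.queryForList(
                "SELECT name FROM default.employee WHERE department = ?",
                String.class, dept);
    }
}
```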
05-10-2024 12:05 PM
Was there any resolution to this? Is Spring datasource supported now?