11-11-2021 05:02 PM
Accepted Solutions
11-23-2021 11:25 PM
Hi @Vamsee krishna kanth Arcot Yes, currently you have to download the JDBC driver from https://databricks.com/spark/jdbc-drivers-download and connect from other applications using the JDBC URL, just as in your example.
There is an internal request (DB-I-4081: Publish Databricks JDBC driver to Maven Central). Please upvote it if you have a Databricks Ideas Portal login; upvotes raise the request's priority.
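To illustrate, a connection from a plain Java application could look like the sketch below. The host, HTTP path, and token are placeholders you would take from your cluster's JDBC/ODBC connection details; the URL parameters follow the Simba Spark driver's documented format, but verify them against your driver version:

```java
import java.sql.Connection;
import java.sql.DriverManager;

public class DatabricksJdbcExample {

    // Builds a Databricks JDBC URL for the legacy Simba Spark driver.
    // AuthMech=3 with UID=token selects personal-access-token authentication.
    static String buildUrl(String host, String httpPath, String token) {
        return "jdbc:spark://" + host + ":443/default"
                + ";transportMode=http;ssl=1"
                + ";httpPath=" + httpPath
                + ";AuthMech=3;UID=token;PWD=" + token;
    }

    public static void main(String[] args) {
        // Placeholder values -- substitute your workspace's actual details.
        String url = buildUrl("adb-1234567890123456.7.azuredatabricks.net",
                "sql/protocolv1/o/1234567890123456/0123-456789-abc123",
                "dapiXXXXXXXXXXXX");
        System.out.println(url);
        // With SparkJDBC42.jar on the classpath, the connection itself is:
        // Class.forName("com.simba.spark.jdbc.Driver");
        // try (Connection conn = DriverManager.getConnection(url)) { ... }
    }
}
```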
11-11-2021 11:14 PM
Hi @Vamsee krishna kanth Arcot The Databricks classloader does not support loading classes from a Spring Boot uber jar.
There is a workaround:
- Using an init script, replace the default Spring jar installed on the Databricks cluster with the required version. Follow the steps in this KB article: https://docs.microsoft.com/en-us/azure/databricks/kb/libraries/replace-default-jar-new-jar
- Use the maven-shade-plugin to build the jar
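The second step could look like the pom.xml fragment below. This is a sketch, not a definitive configuration: the plugin version and main class are placeholders, and the AppendingTransformer entries merge Spring's META-INF metadata files so that auto-configuration still works in the shaded jar:

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.2.4</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals><goal>shade</goal></goals>
      <configuration>
        <transformers>
          <!-- Merge Spring's metadata files from all dependencies
               instead of letting one jar overwrite another's copy -->
          <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
            <resource>META-INF/spring.handlers</resource>
          </transformer>
          <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
            <resource>META-INF/spring.factories</resource>
          </transformer>
          <!-- Placeholder main class -- use your application's entry point -->
          <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
            <mainClass>com.example.Application</mainClass>
          </transformer>
        </transformers>
      </configuration>
    </execution>
  </executions>
</plugin>
```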
11-17-2021 08:19 AM
Hi @Vamsee krishna kanth Arcot did the workaround help you resolve the issue?
11-17-2021 09:24 AM
Hi @Prabakar Ammeappin ,
I did not understand what you are referring to.
Anyway, I was able to download the SparkJDBC42.jar and include it in my Spring Boot application in the IntelliJ IDE.
I am also trying to connect using the code below:
DriverManagerDataSource dataSource = new DriverManagerDataSource();
dataSource.setDriverClassName("com.simba.spark.jdbc.Driver");
I am yet to test it, but my question is: since the JAR is NOT available through Maven, what is the alternative?
Do we need to manually install it on every server, irrespective of whether it is dev/QA/prod?
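One way to avoid installing the jar on every server is to let the build carry it, e.g. via a system-scoped dependency on the downloaded file (the coordinates and version below are hypothetical; the jar does not publish official Maven coordinates). Alternatively, install the jar into an internal/local repository with mvn install:install-file and depend on it normally:

```xml
<dependency>
  <groupId>com.simba.spark</groupId>
  <artifactId>SparkJDBC42</artifactId>
  <version>2.6.x</version><!-- hypothetical version -->
  <scope>system</scope>
  <systemPath>${project.basedir}/lib/SparkJDBC42.jar</systemPath>
</dependency>
```

Note that system scope ties the build to a file path, so a shared internal repository is usually the more maintainable choice across dev/QA/prod.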

