10-06-2022 03:44 AM
Hi,
I am using Databricks through Azure. I am trying to connect to a remote Oracle database using a JDBC URL, but I am getting a "no suitable driver found" error.
"java.sql.SQLException: No suitable driver"
Can somebody help me with this?
10-06-2022 11:07 PM
Hi @vikas k, you have to load the driver class.
For example:
Class.forName("com.mysql.jdbc.Driver");
10-07-2022 02:34 AM
I tried that and got a "class not defined" error.
10-07-2022 07:09 AM
Hi @vikas k, This article covers how to use the DataFrame API to connect to SQL databases using JDBC and how to control the parallelism of reads through the JDBC interface. This article provides detailed examples using the Scala API, with abbreviated Python and Spark SQL examples at the end. For all of the supported arguments for connecting to SQL databases using JDBC, see JDBC To Other Databases.
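On the parallelism point mentioned in that article, the Spark JDBC source controls parallel reads through four options that must be set together. A minimal sketch follows; the column name and bounds are hypothetical placeholders, and `partitionColumn` must be a numeric, date, or timestamp column:

```python
# Sketch: options that let Spark split a JDBC read into parallel range
# queries. All values below are placeholders for illustration.
parallel_opts = {
    "partitionColumn": "ID",   # hypothetical numeric key column
    "lowerBound": "1",         # smallest value of the partition column
    "upperBound": "100000",    # largest value of the partition column
    "numPartitions": "8",      # Spark issues 8 concurrent range queries
}

# The read itself needs a running Spark session, so it is commented out:
# df = (spark.read.format("jdbc")
#       .option("url", jdbc_url)        # your JDBC URL
#       .option("dbtable", "MY_TABLE")  # placeholder table name
#       .options(**parallel_opts)
#       .load())
print(parallel_opts["numPartitions"])
```

Note the bounds only decide how the partition ranges are cut; rows outside them are still read, just all into the edge partitions.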
10-09-2022 10:39 PM
Hi @vikas k, we haven't heard back from you since my last response, and I was checking to see whether you have a resolution yet.
If you have found a solution, please share it with the community, as it can be helpful to others. Otherwise, we will respond with more details and try to help.
Also, please don't forget to click the "Select As Best" button whenever the information provided helps resolve your question.
10-10-2022 04:13 AM
The problem persists. That article doesn't cover installing the driver on Databricks.
10-14-2022 04:40 AM
When you query a JDBC source other than MSSQL, you need to install the driver and set the driver option.
So you need to get the Oracle JDBC driver and install it first.
https://www.oracle.com/database/technologies/maven-central-guide.html
You need to know which version to use (ask your administrator or support), and then you can install it from Maven in the cluster configuration.
driver = "oracle.jdbc.driver.OracleDriver"
remote_table = (spark.read
    .format("jdbc")
    .option("driver", driver)
    ...
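To fill out the snippet above, a minimal sketch of the full read might look like this; the host, port, service name, table, and credentials are hypothetical placeholders, and the cluster must already have the ojdbc jar installed (e.g. from Maven):

```python
# Placeholder connection details -- replace with your own.
host, port, service = "dbhost.example.com", 1521, "ORCLPDB1"
jdbc_url = f"jdbc:oracle:thin:@//{host}:{port}/{service}"

# Driver class shipped inside the ojdbc jar installed on the cluster.
driver = "oracle.jdbc.driver.OracleDriver"

# The read itself needs a running Spark session plus the installed
# driver jar, so it is left commented out in this sketch:
# remote_table = (spark.read
#     .format("jdbc")
#     .option("driver", driver)
#     .option("url", jdbc_url)
#     .option("dbtable", "MY_SCHEMA.MY_TABLE")  # placeholder
#     .option("user", "my_user")                # placeholder
#     .option("password", "my_password")        # placeholder
#     .load())
print(jdbc_url)
```

If the "driver" option is omitted and the jar is not registered with DriverManager, you get exactly the "No suitable driver" error from the original question.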
10-25-2022 02:57 PM
Hi @vikas k, we haven't heard back from you since the last response from @Hubert Dudek, and I was checking to see whether you have a resolution yet.
If you have found a solution, please share it with the community, as it can be helpful to others. Otherwise, we will respond with more details and try to help.
Also, please don't forget to click the "Select As Best" button whenever the information provided helps resolve your question.
11-15-2022 12:47 AM
Hi @vikas k
Hope all is well!
Was @Hubert Dudek's response able to resolve your issue? If so, would you be happy to share the solution or mark the answer as best? Otherwise, please let us know if you need more help.
We'd love to hear from you.
Thanks!