10-06-2022 03:44 AM
Hi,
I am using Databricks through Azure. I am trying to connect to a remote Oracle database using a JDBC URL, and I am getting a "no suitable driver found" error:
"java.sql.SQLException: No suitable driver"
Can somebody help me with this?
10-06-2022 11:07 PM
Hi @vikas k, you have to load the driver class.
For example:
Class.forName("com.mysql.jdbc.Driver");
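On Databricks you usually don't need Class.forName at all; in PySpark the driver class can be passed as a reader option instead. A minimal sketch, assuming the Oracle JDBC (ojdbc) library is installed on the cluster, which is where the class name below comes from:
# PySpark: name the JDBC driver class as an option instead of calling Class.forName
reader = spark.read.format("jdbc").option("driver", "oracle.jdbc.OracleDriver")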
10-07-2022 02:34 AM
That gives me a "class not defined" error.
10-10-2022 04:13 AM
The problem remains. This doesn't cover how to install the driver on Databricks.
10-14-2022 04:40 AM
When you query over JDBC against anything other than MSSQL, you need to install the driver and set the driver option.
So you need to get the JDBC driver for Oracle and install it first:
https://www.oracle.com/database/technologies/maven-central-guide.html
You need to know which driver version to use (ask your administrator or support), and then you can install it from Maven in the cluster configuration.
driver = "your_driver_name"
remote_table = (spark.read
.format("jdbc")
.option("driver", driver)
...
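A fuller sketch of the read, assuming the Oracle driver was installed on the cluster from Maven (the coordinates are typically of the form com.oracle.database.jdbc:ojdbc8:<version>; check with your admin which version you need) and that the host, service name, table, and credentials below are placeholders:
driver = "oracle.jdbc.OracleDriver"                     # Oracle thin-driver class from ojdbc
url = "jdbc:oracle:thin:@//<host>:1521/<service_name>"  # placeholder connection string

remote_table = (spark.read
    .format("jdbc")
    .option("driver", driver)
    .option("url", url)
    .option("dbtable", "<SCHEMA>.<TABLE>")              # placeholder table
    .option("user", "<username>")
    .option("password", "<password>")
    .load())

display(remote_table)
If the driver jar is not installed on the cluster, Spark will still throw "No suitable driver", so the Maven/library install step has to happen before running the read.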
11-15-2022 12:47 AM
Hi @vikas k,
Hope all is well!
Were you able to resolve your issue with @Hubert Dudek's response? If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help.
We'd love to hear from you.
Thanks!