remote database connection error

aarave
New Contributor III

Hi,

I am using Databricks through Azure. I am trying to connect to a remote Oracle database using a JDBC URL, but I am getting a "no suitable driver found" error:

"java.sql.SQLException: No suitable driver"

Can somebody help me with this?

8 REPLIES

Debayan
Esteemed Contributor III

Hi @vikas k​, you have to load the driver class.

For example:

Class.forName("com.mysql.jdbc.Driver");
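
For an Oracle source the class name differs from the MySQL example above. A minimal PySpark sketch, assuming the Oracle JDBC jar is already attached to the cluster; the class name is Oracle's standard thin-driver class, not something stated in this thread:

# Pass the Oracle driver class explicitly when reading over JDBC
oracle_driver = "oracle.jdbc.driver.OracleDriver"  # assumption: standard Oracle thin driver
reader = spark.read.format("jdbc").option("driver", oracle_driver)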

aarave
New Contributor III

Class not defined error

Kaniz
Community Manager

Hi @vikas k​, This article covers how to use the DataFrame API to connect to SQL databases using JDBC and how to control the parallelism of reads through the JDBC interface. This article provides detailed examples using the Scala API, with abbreviated Python and Spark SQL examples at the end. For all of the supported arguments for connecting to SQL databases using JDBC, see JDBC To Other Databases.
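
As a rough illustration of what the article describes, here is a minimal PySpark sketch of a parallel JDBC read. Every connection value (host, service, table, partition column, bounds, credentials) is a placeholder assumption, not something from this thread, and the Oracle driver must already be installed on the cluster:

# Read an Oracle table over JDBC, splitting the read into parallel tasks
jdbc_url = "jdbc:oracle:thin:@//dbhost.example.com:1521/ORCLPDB1"  # placeholder host/service

df = (spark.read
  .format("jdbc")
  .option("url", jdbc_url)
  .option("driver", "oracle.jdbc.driver.OracleDriver")
  .option("dbtable", "MY_SCHEMA.MY_TABLE")        # or a subquery alias: "(SELECT ...) t"
  .option("user", "db_user")
  .option("password", "db_password")
  # Parallelism controls: Spark issues one query per partition,
  # slicing partitionColumn between lowerBound and upperBound
  .option("numPartitions", 8)
  .option("partitionColumn", "ID")
  .option("lowerBound", 1)
  .option("upperBound", 1000000)
  .load())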

Kaniz
Community Manager

Hi @vikas k​, we haven't heard from you since my last response, and I was checking back to see whether you have a resolution yet.

If you have a solution, please share it with the community, as it can be helpful to others. Otherwise, we will respond with more details and try to help.

Also, please don't forget to click the "Select As Best" button whenever the information provided helps resolve your question.

aarave
New Contributor III

The problem is still there. The article doesn't cover installing the driver on Databricks.

Hubert-Dudek
Esteemed Contributor III

When you query a JDBC source other than MSSQL, you need to install the driver and set the driver option.

So you need to get the JDBC driver for Oracle and install it first.

https://www.oracle.com/database/technologies/maven-central-guide.html

You need to know which driver you need (ask your administrator or support), and then you can install it from Maven in the cluster config.
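
For reference, Oracle publishes its JDBC driver to Maven Central under the com.oracle.database.jdbc group; the exact artifact and version below are an assumption, so pick the one matching your database:

com.oracle.database.jdbc:ojdbc8:21.9.0.0

In the cluster's Libraries tab, you can typically choose Install new > Maven and paste the coordinate; once the library is attached, the driver class is available on the classpath.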

driver = "your_driver_name"  # for Oracle this is typically "oracle.jdbc.driver.OracleDriver"

remote_table = (spark.read
  .format("jdbc")
  .option("driver", driver)
  .option("url", "jdbc:oracle:thin:@//<host>:<port>/<service>")  # placeholder URL
  .option("dbtable", "<schema>.<table>")
  .option("user", "<username>")
  .option("password", "<password>")
  .load())

Kaniz
Community Manager

Hi @vikas k​, we haven't heard from you since the last response from @Hubert Dudek​, and I was checking back to see whether you have a resolution yet.

If you have a solution, please share it with the community, as it can be helpful to others. Otherwise, we will respond with more details and try to help.

Also, please don't forget to click the "Select As Best" button whenever the information provided helps resolve your question.

Anonymous
Not applicable

Hi @vikas k​ 

Hope all is well!

Was @Hubert Dudek​'s response able to resolve your issue? If so, would you be happy to share the solution or mark the answer as best? Otherwise, please let us know if you need more help.

We'd love to hear from you.

Thanks!
