Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Connection timeout when connecting to MongoDB using MongoDB Connector for Spark 10.x

RobsonNLPT
Contributor III

Hi.

I'm testing a Databricks connection to a MongoDB 7 cluster (hosted on Azure) using the library org.mongodb.spark:mongo-spark-connector_2.13:10.4.1.

I can connect using Compass, but I get a timeout error from my Azure Databricks notebook:

MongoTimeoutException: Timed out while waiting for a server that matches ReadPreferenceServerSelector{readPreference=primary}. Client view of cluster state is {type=UNKNOWN, servers=[{address=localhost:27017, type=UNKNOWN, state=CONNECTING, exception={com.mongodb.MongoSocketOpenException: Exception opening socket}, caused by {java.net.ConnectException: Connection refused}}]

By the way, I can telnet to the server successfully (%sh telnet .....)

Any ideas?
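One thing worth checking: with the 10.x connector, the `connection.uri` option defaults to `mongodb://localhost:27017/` when no URI is supplied (either as `spark.mongodb.read.connection.uri` in the cluster's Spark config or via `.option("connection.uri", ...)` on the reader), which would explain the `address=localhost:27017` in the error. A minimal sketch of the 10.x option keys — host, credentials, database, and collection below are placeholders, not values from this thread:

```python
# Option keys for mongo-spark-connector 10.x (the 3.x keys
# "spark.mongodb.input.uri"/"spark.mongodb.output.uri" no longer apply,
# and the format name changed to "mongodb").
# All values below are placeholders.
mongo_options = {
    "connection.uri": "mongodb://appUser:appPass@my-mongo.example.com:27017/admin?tls=true",
    "database": "mydb",
    "collection": "mycoll",
}

# In a Databricks notebook `spark` is predefined; the read itself would be:
#   df = (spark.read.format("mongodb")
#                   .options(**mongo_options)
#                   .load())
```

If the URI is only set under a 3.x-style key, the 10.x connector silently ignores it and falls through to the localhost default, which matches the symptom in the stack trace.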

3 REPLIES

RobsonNLPT
Contributor III

Any help?

Kirki
New Contributor II

Hi. Not a solution I'm afraid, but I'm having the exact same issue. Did you manage to resolve it?

What is throwing me is that I'm configuring the IP of the MongoDB instance, since it's running on an EC2 instance in AWS, but I still see 'localhost' in the error message, as if my configuration is being ignored. Is this similar to what you're seeing?

Hi.

Yes, same here. I see localhost.

My cluster is deployed on Azure Kubernetes. I can connect using pymongo and also Compass.

I also tested with a free Atlas cluster and it worked (I changed the Atlas firewall rule to allow my Databricks workspace).

No clues.
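A pattern consistent with both reports (AKS here, EC2 above): when connecting to a replica set, the driver contacts the seed host, fetches the replica set configuration, and then dials the hostnames *advertised in that configuration*. If the members advertise internal names (Kubernetes pod DNS, or even `localhost`), the notebook cannot reach them even though the seed itself answers telnet. One hedged workaround, assuming the single seed host is reachable from Databricks (host and credentials below are placeholders), is the standard `directConnection=true` URI option, which skips replica-set discovery:

```python
# Sketch: append directConnection=true so the driver talks only to this
# host and never dials the internally-advertised member hostnames.
# Host, port, and credentials are placeholders.
base_uri = "mongodb://appUser:appPass@my-mongo.example.com:27017/admin"

# Use "&" if the URI already carries query parameters, "?" otherwise.
sep = "&" if "?" in base_uri else "?"
direct_uri = f"{base_uri}{sep}directConnection=true"

# Then hand it to the connector:
#   spark.read.format("mongodb").option("connection.uri", direct_uri) ...
```

Note the trade-off: a direct connection gives up automatic failover to other replica set members, so the more durable fix is to make the replica set advertise hostnames that are resolvable and routable from the Databricks workspace.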

