Connection timeout when connecting to MongoDB using MongoDB Connector for Spark 10.x
02-27-2025 08:10 AM - edited 02-27-2025 08:16 AM
Hi.
I'm testing a Databricks connection to a MongoDB 7 cluster (running on Azure) using the library org.mongodb.spark:mongo-spark-connector_2.13:10.4.1.
I can connect using Compass, but from my ADB notebook I get a timeout error:
MongoTimeoutException: Timed out while waiting for a server that matches ReadPreferenceServerSelector{readPreference=primary}. Client view of cluster state is {type=UNKNOWN, servers=[{address=localhost:27017, type=UNKNOWN, state=CONNECTING, exception={com.mongodb.MongoSocketOpenException: Exception opening socket}, caused by {java.net.ConnectException: Connection refused}}]
By the way, I can telnet to the server successfully (%sh telnet .....)
Any ideas?
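Not from the thread, but one common cause worth checking: the 10.x connector renamed the configuration keys, and when it finds no connection URI it falls back to its built-in default of mongodb://localhost:27017, which would match the localhost:27017 in the error above. A minimal sketch of the key rename (the URI, user, and database names below are placeholders, not values from this thread):

```python
# Sketch of the 3.x -> 10.x config-key rename; all names are placeholders.
URI = "mongodb://user:pass@my-mongo-host:27017/mydb?authSource=admin"

# 3.x-era keys -- the 10.x connector ignores these and then falls back to
# its default of mongodb://localhost:27017 (hence "localhost" in the error):
legacy_conf = {
    "spark.mongodb.input.uri": URI,
    "spark.mongodb.output.uri": URI,
}

# Keys that mongo-spark-connector 10.x actually reads:
conf_10x = {
    "spark.mongodb.read.connection.uri": URI,
    "spark.mongodb.write.connection.uri": URI,
}
```

In a notebook you would pass the conf_10x entries via SparkSession.builder.config(...) (or the cluster's Spark config) and read with spark.read.format("mongodb"), not the old com.mongodb.spark.sql.DefaultSource format string from 3.x.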
02-28-2025 06:32 AM
Any help?
a month ago
Hi. Not a solution I'm afraid, but I'm having the exact same issue. Did you manage to resolve it at all?
What is throwing me is that I'm configuring the IP for the MongoDB instance, since it's running on an EC2 instance in AWS, but I still see 'localhost' in the error message, as if my configuration is being ignored. Is this similar to what you're seeing?
a month ago - last edited a month ago
Hi.
Yes, same here. I see localhost.
My cluster is deployed on Azure Kubernetes. I can connect using PyMongo and also Compass.
I've also tested with a free Atlas cluster and that worked (I changed the Atlas firewall rule to allow my Databricks workspace).
No clues.
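For anyone else landing here: one way to take the session config out of the picture is to attach the URI directly to the read itself. A sketch with placeholder values; in 10.x the DataFrameReader accepts options without the spark.mongodb.read prefix (worth double-checking against the connector docs for your exact version), and a per-read connection.uri takes precedence over whatever is (or isn't) on the session:

```python
# Sketch: build the per-read options explicitly (placeholder values) so the
# URI travels with the read instead of relying on session-level config.
URI = "mongodb://user:pass@my-mongo-host:27017/?authSource=admin"

read_options = {
    "connection.uri": URI,   # per-read override of spark.mongodb.read.connection.uri
    "database": "mydb",      # placeholder
    "collection": "mycoll",  # placeholder
}

# In a notebook this would be used as:
#   df = spark.read.format("mongodb").options(**read_options).load()
```

If this read still reports localhost:27017 while PyMongo connects fine from the same notebook, the connector is simply not seeing the URI you think you set.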

