Can't connect to Atlas

attie_bc
New Contributor III

I have this:

connectionString='mongodb+srv://user:pw@something.jghu.mongodb.net/?retryWrites=true&w=majority&appName=dbricks&tls=true'
database='dealflow'
collection='activities'

frame = spark.read.format("mongodb") \
    .option("spark.mongodb.read.connection.uri", connectionString) \
    .option("spark.mongodb.read.database", database) \
    .option("spark.mongodb.read.collection", collection) \
    .option("spark.mongodb.read.readPreference.name", "primaryPreferred") \
    .load()

display(frame)

That read times out.

When I telnet to the Atlas host I get a "server lookup failure". This only happens on Databricks.

Isi
Honored Contributor III

hey @attie_bc 

I guess you are using an All-purpose cluster. Have you tried

curl https://www.google.com

Maybe your cluster doesn't have internet access? If the curl fails too, your cluster likely doesn't have outbound internet access, which is required to resolve SRV records like *.mongodb.net. That would explain the "server lookup failure" you're seeing.
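A quick way to check DNS resolution from a notebook cell is a small sketch like this (plain Python, standard library only; note that resolving the SRV record itself would need an extra package such as dnspython, so this only checks ordinary A-record resolution):

```python
import socket

def resolves(hostname: str) -> bool:
    """Return True if this environment's DNS can resolve hostname to an address."""
    try:
        socket.getaddrinfo(hostname, None)
        return True
    except socket.gaierror:
        return False

# Sanity check that resolution works at all; should print True almost anywhere.
print(resolves("localhost"))

# Then try the host from your own connection string, e.g.:
# print(resolves("something.jghu.mongodb.net"))
```

If the Atlas hostname returns False while localhost returns True, the problem is DNS/egress from the cluster rather than the Spark connector.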

Also, two other things worth checking:

MongoDB Atlas IP Access List
Even if DNS resolves correctly, Atlas will block the connection unless your Databricks cluster’s IP is whitelisted in the Network Access section. Docs

SRV Connection String Dependency on DNS
The SRV connection string (mongodb+srv://...) depends on resolving DNS SRV records. In some restricted environments (like private VPCs), this won’t work without proper DNS forwarding. Docs  
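If SRV lookups are blocked but ordinary DNS works, a common workaround is the standard (non-SRV) mongodb:// connection string, which lists the replica-set members explicitly. A hedged sketch below; the shard hostnames and replica-set name are placeholders you would copy from your Atlas cluster's "Connect" dialog:

```python
# Placeholder shard hosts; copy the real ones from the Atlas UI.
hosts = [
    "something-shard-00-00.jghu.mongodb.net:27017",
    "something-shard-00-01.jghu.mongodb.net:27017",
    "something-shard-00-02.jghu.mongodb.net:27017",
]

# Standard-format URI: no SRV lookup needed, only plain A-record resolution.
connectionString = (
    "mongodb://user:pw@" + ",".join(hosts)
    + "/?ssl=true&replicaSet=atlas-xxxxxx-shard-0"  # placeholder replica-set name
    + "&authSource=admin&retryWrites=true&w=majority"
)
print(connectionString)
```

You can then pass this string to the same spark.read options as before, unchanged.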

Hope this helps 🙂

Isi

attie_bc
New Contributor III

Thank you. The issue in the end was that the AWS VPC security group was blocking outbound traffic on port 27017. I added an outbound rule to the security group to allow it, and that solved it.
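For anyone hitting the same thing: before editing security groups, you can confirm from a notebook whether port 27017 is reachable with a small TCP probe (plain Python; the commented hostname is a placeholder for one of your Atlas shard hosts):

```python
import socket

def can_connect(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Placeholder host; use a shard host from your own Atlas cluster, e.g.:
# print(can_connect("something-shard-00-00.jghu.mongodb.net", 27017))
```

If DNS resolves but this returns False, the traffic is being blocked somewhere between the cluster and Atlas (security group, NACL, or firewall).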
