Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

MongoDB connector - Connection timeout when trying to connect to AWS DocumentDB

rijin-thomas
Visitor

I am on Databricks Runtime LTS 14.3 (Spark 3.5.0, Scala 2.12) with mongodb-spark-connector_2.12:10.2.0.

When I try to connect to DocumentDB using the connector, all I get is a connection timeout. I tried PyMongo, which works as expected: I can read from the database. I have a CA file that I'm passing as an argument; it's stored in Unity Catalog.
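For reference, the working PyMongo path looks roughly like the sketch below. The credentials, endpoint, and CA file path are placeholders, not the real values from my environment; note that a username or password containing special characters must be percent-encoded before being embedded in the URI.

```python
# Hedged sketch of the PyMongo connection that works, with placeholder values.
from urllib.parse import quote_plus

USERNAME = "user"                                  # placeholder
PASSWORD = "p@ss/word"                             # placeholder (has URI-reserved chars)
ENDPOINT = "docdb.cluster.example.amazonaws.com"   # placeholder
DATABASE_NAME = "mydb"                             # placeholder

# quote_plus() percent-encodes characters like '@' and '/' so the URI parses.
uri = (
    f"mongodb://{quote_plus(USERNAME)}:{quote_plus(PASSWORD)}@{ENDPOINT}:27017/"
    f"{DATABASE_NAME}?replicaSet=rs0&readPreference=secondaryPreferred"
)
print(uri)

# With PyMongo installed, the connection itself would then be:
#   from pymongo import MongoClient
#   client = MongoClient(uri, tls=True, tlsCAFile="/local/path/to/global-bundle.pem")
#   client[DATABASE_NAME].my_collection.find_one()
```

This is just to show the two access paths are pointed at the same cluster with the same TLS material; only the Spark connector path times out.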

Code:

CONNECTION_URI = f"mongodb://{USERNAME}:{PASSWORD}@{ENDPOINT}:27017/{DATABASE_NAME}?replicaSet=rs0&readPreference=secondaryPreferred"

df = spark.read.format("mongodb") \
     .option("spark.mongodb.read.connection.uri", CONNECTION_URI) \
     .option("collection", COLLECTION) \
     .option("ssl", "true") \
     .option("ssl.CAFile", DBFS_CA_FILE) \
     .option("ssl.enabledProtocols", "TLSv1.2") \
     .load()

Connector error:

SparkConnectGrpcException: (com.mongodb.MongoTimeoutException) Timed out after 30000 ms while waiting for a server that matches com.mongodb.client.internal.MongoClientDelegate. Client view of state is {type=REPLICA_SET, servers=[{address=<endpoint>, type=UNKNOWN, state=CONNECTING, exception={com.mongodb.MongoSocketReadTimeoutException: Timeout while receiving message}, caused by {java.net.SocketTimeoutException: Read timed out}}]