My MongoDB Atlas connection URL looks like mongodb+srv://<srv_hostname>
I don't want to use a direct URL like mongodb://<hostname1, hostname2, hostname3....> because our MongoDB Atlas global clusters have many hosts, and the host list would be hard to maintain.
The Java programs in our GCP cloud that connect to the same MongoDB Atlas SRV hostname do not have this issue, so I suspect something is wrong with our Databricks/PySpark configuration, or there is a driver version incompatibility. Please help.
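For context, the srvResolutionException in the traceback below means the driver could not resolve the DNS SRV record behind the mongodb+srv:// scheme. A mongodb+srv:// URI makes the driver query "_mongodb._tcp.<srv_hostname>" for SRV records to discover the cluster members. A minimal diagnostic sketch to run in a notebook cell on the Databricks cluster (the hostname is taken from the error message; dnspython is an assumption and must be installed on the cluster first):

```python
def srv_query_name(srv_hostname: str) -> str:
    """Return the DNS SRV record name that a mongodb+srv:// URI resolves."""
    return "_mongodb._tcp." + srv_hostname

# Hostname from the error message in the traceback.
host = "mongo-core-coreapp-clus-pri.tpl3f.mongodb.net"
print(srv_query_name(host))

# If this lookup fails from the cluster's network (VPC DNS, firewall, custom
# resolver), the MongoDB driver fails the same way. dnspython is assumed
# here (pip install dnspython); the try/except keeps the cell from crashing
# when it is absent or the network blocks the query.
try:
    import dns.resolver  # third-party: dnspython
    for record in dns.resolver.resolve(srv_query_name(host), "SRV"):
        print(record.target, record.port)
except Exception as exc:
    print("SRV lookup failed:", exc)
```

If the SRV lookup fails here but succeeds from the GCP hosts running the Java programs, the problem is the Databricks network's DNS resolution rather than the Spark connector itself.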
level: ERROR
message: Exception occured: %s Traceback (most recent call last):
File "<command-389574246471033>", line 9, in <module>
if df.count() != 0:
File "/databricks/spark/python/pyspark/sql/dataframe.py", line 688, in count
return int(self._jdf.count())
File "/databricks/spark/python/lib/py4j-0.10.9.1-src.zip/py4j/java_gateway.py", line 1304, in __call__
return_value = get_return_value(
File "/databricks/spark/python/pyspark/sql/utils.py", line 117, in deco
return f(*a, **kw)
File "/databricks/spark/python/lib/py4j-0.10.9.1-src.zip/py4j/protocol.py", line 326, in get_return_value
raise Py4JJavaError(
py4j.protocol.Py4JJavaError: An error occurred while calling o468.count.
: com.mongodb.MongoTimeoutException: Timed out after 30000 ms while waiting for a server that matches com.mongodb.client.internal.MongoClientDelegate$1@43eaa290. Client view of cluster state is {type=REPLICA_SET, srvResolutionException=com.mongodb.MongoConfigurationException: No SRV records available for _mongodb._tcp.mongo-core-coreapp-clus-pri.tpl3f.mongodb.net, servers=[]
at com.mongodb.internal.connection.BaseCluster.createTimeoutException(BaseCluster.java:424)
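For reference, our read follows the standard MongoDB Spark Connector pattern. A configuration sketch with the SRV URI (placeholders are ours; the format name and option keys differ between connector 3.x and 10.x, so verify them against the connector version installed on the cluster):

```python
# Hypothetical sketch -- option names depend on the connector version.
uri = "mongodb+srv://<user>:<password>@<srv_hostname>/<db>"

# MongoDB Spark Connector 3.x:
df = (spark.read.format("mongo")
      .option("uri", uri)
      .option("collection", "<collection>")
      .load())

# MongoDB Spark Connector 10.x:
df = (spark.read.format("mongodb")
      .option("connection.uri", uri)
      .option("database", "<db>")
      .option("collection", "<collection>")
      .load())
```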