Hi vidya. I have the same problem.

I can connect using pymongo and Compass. I installed the library org.mongodb.spark:mongo-spark-connector_2.13:10.4.1 (the latest) on my cluster running Databricks Runtime 16.2, but I was never able to connect to the same (sharded) MongoDB cluster using the primary as the default.

This is the Scala code (I've tested it in Python as well):

val connstr = "mongodb://user:xxxxxxx@cluster/dbxxx?tls=true&tlsInsecure=true&authSource=admin"

val df = spark.read.format("mongodb")
  .option("spark.mongodb.read.connection.uri", connstr)
  .option("database", "dbdbdbdbdb")
  .option("collection", "cccccccccc")
  .load()
  .limit(5)
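For reference, the same read can also be written with the connector's unprefixed option key, or with the URI set once on the Spark session. This is a sketch, not a confirmed fix; `connstr` and the database/collection names are the same placeholders as above:

```scala
// Variant A: the connector 10.x docs list "connection.uri" (without the
// "spark.mongodb.read." prefix) as the reader option key.
val dfA = spark.read.format("mongodb")
  .option("connection.uri", connstr)
  .option("database", "dbdbdbdbdb")
  .option("collection", "cccccccccc")
  .load()

// Variant B: set the fully-prefixed key once at the session level,
// then name only the database/collection per read.
spark.conf.set("spark.mongodb.read.connection.uri", connstr)
val dfB = spark.read.format("mongodb")
  .option("database", "dbdbdbdbdb")
  .option("collection", "cccccccccc")
  .load()
```

If variant A works where the prefixed per-reader key does not, that would point to an option-key resolution issue rather than a network or auth problem.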
 
Also, I can telnet to the cluster successfully.
 
Any clues?