a week ago
I followed the official Databricks documentation (https://docs.databricks.com/en/_extras/notebooks/source/mongodb.html) to integrate MongoDB Atlas with Spark by setting up the MongoDB Spark Connector and configuring the connection string in my Databricks cluster. However, I am encountering issues when trying to read data from MongoDB using Spark.
While I can successfully connect to MongoDB using MongoClient in Python and execute queries directly, I am unable to load data using the Spark connector with the following code:
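(The original snippets are not included in the post; a hypothetical sketch of the two access paths being compared might look like this. The connection string, database, and collection names below are placeholders, not the poster's actual values.)

```python
def read_with_driver(connection_string, database, collection):
    # Direct driver access: this path reportedly works.
    from pymongo import MongoClient  # requires the pymongo package
    client = MongoClient(connection_string)
    return client[database][collection].find_one()

def read_with_spark(spark, connection_string, database, collection):
    # Spark connector read using the pre-v10 option name
    # "spark.mongodb.input.uri": this path returns no data.
    return (spark.read.format("mongodb")
            .option("spark.mongodb.input.uri", connection_string)
            .option("database", database)
            .option("collection", collection)
            .load())
```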
The connection string is the same in both cases, and I have confirmed that the necessary permissions and IP whitelisting are correctly configured in MongoDB Atlas.
Despite this, no data is being retrieved when using Spark, and I’m unable to identify the issue.
I have also attached an error screenshot below.
Can anyone provide guidance on potential configuration issues or additional troubleshooting steps for the MongoDB Spark connector in Databricks?
Accepted Solutions
a week ago
Hi @vidya_kothavale ,
Could you try changing "spark.mongodb.input.uri" to the following read option?
spark.read.format("mongodb").option("spark.mongodb.read.connection.uri", <your connection string>)
a week ago
Thanks, @szymon_dybczak! It's working perfectly now.