<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>MongoDB Spark Connection Issues in Warehousing &amp; Analytics</title>
    <link>https://community.databricks.com/t5/warehousing-analytics/mongodb-spark-connection-issues/m-p/111605#M1915</link>
    <description>Forum thread: Databricks on AWS cannot connect to a MongoDB instance running on an EC2 host in the same VPC; the MongoDB Spark connector times out attempting to reach localhost:27017 instead of the configured IP.</description>
    <pubDate>Mon, 03 Mar 2025 15:46:08 GMT</pubDate>
    <dc:creator>Kirki</dc:creator>
    <dc:date>2025-03-03T15:46:08Z</dc:date>
    <item>
      <title>MongoDB Spark Connection Issues</title>
      <link>https://community.databricks.com/t5/warehousing-analytics/mongodb-spark-connection-issues/m-p/111605#M1915</link>
      <description>&lt;P&gt;Hi. I have a local MongoDB instance running on an EC2 host in the same AWS VPC as my Databricks cluster, but I cannot get Databricks to talk to MongoDB.&lt;BR /&gt;&lt;BR /&gt;I've followed the guide at&amp;nbsp;&lt;A href="https://docs.databricks.com/aws/en/connect/external-systems/mongodb" target="_blank" rel="noopener"&gt;https://docs.databricks.com/aws/en/connect/external-systems/mongodb&lt;/A&gt;&amp;nbsp;and have also reviewed the MongoDB guidance at&amp;nbsp;&lt;A href="https://www.mongodb.com/docs/spark-connector/current/getting-started/" target="_blank" rel="noopener"&gt;https://www.mongodb.com/docs/spark-connector/current/getting-started/&lt;/A&gt;&amp;nbsp;but to no avail.&lt;BR /&gt;&lt;BR /&gt;I've tried adding the MongoDB settings to the cluster's Spark configuration, and also configuring them locally within the notebook:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;from pyspark.sql import SparkSession

my_spark = SparkSession \
    .builder \
    .appName("myApp") \
    .config("spark.mongodb.read.connection.uri", "mongodb://x.x.x.x:27017/") \
    .config("spark.mongodb.write.connection.uri", "mongodb://x.x.x.x:27017/") \
    .getOrCreate()

database = "mydatabase"
collection = "mycollection"

df = my_spark.read.format("mongodb") \
    .option("database", database) \
    .option("collection", collection) \
    .load()&lt;/LI-CODE&gt;&lt;P class=""&gt;However, on each run I get the following error, regardless of how I configure things:&lt;/P&gt;&lt;P class=""&gt;&lt;EM&gt;(com.mongodb.MongoTimeoutException) Timed out while waiting for a server that matches ReadPreferenceServerSelector{readPreference=primary}. Client view of cluster state is {type=UNKNOWN, servers=[{address=localhost:27017, type=UNKNOWN, state=CONNECTING, exception={com.mongodb.MongoSocketOpenException: Exception opening socket}, caused by {java.net.ConnectException: Connection refused (Connection refused)}}]&lt;/EM&gt;&lt;/P&gt;&lt;P class=""&gt;I've verified connectivity to the EC2 host that is running the MongoDB instance, but from the error it looks like the connector is attempting to connect to localhost:27017 rather than the IP I've configured. Is this just a bogus error, or am I missing something in the config?&lt;/P&gt;&lt;P class=""&gt;I'm out of ideas, so I'm looking for some help/guidance. Thanks!&lt;/P&gt;</description>
      <pubDate>Mon, 03 Mar 2025 15:46:08 GMT</pubDate>
      <guid>https://community.databricks.com/t5/warehousing-analytics/mongodb-spark-connection-issues/m-p/111605#M1915</guid>
      <dc:creator>Kirki</dc:creator>
      <dc:date>2025-03-03T15:46:08Z</dc:date>
    </item>
    <item>
      <title>Re: MongoDB Spark Connection Issues</title>
      <link>https://community.databricks.com/t5/warehousing-analytics/mongodb-spark-connection-issues/m-p/120478#M2083</link>
      <description>&lt;P&gt;Hey&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/151727"&gt;@Kirki&lt;/a&gt;&amp;nbsp;this may be late, but I'll try to help you (or anyone else) get this connection working.&lt;BR /&gt;&lt;BR /&gt;First, make sure the MongoDB Spark connector is installed on your cluster. Since you are using format("mongodb") and the spark.mongodb.read/write.connection.uri settings, you need the 10.x connector; the 3.x line uses format("mongo") and spark.mongodb.input/output.uri instead:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;org.mongodb.spark:mongo-spark-connector_2.12:10.3.0&lt;/LI-CODE&gt;&lt;P&gt;Also note that in a Databricks notebook a SparkSession already exists, so SparkSession.builder.config(...).getOrCreate() returns the existing session and the URIs you pass to the builder are never applied; that is why the connector falls back to its default of localhost:27017. Instead, pass the URI directly in your spark.read call:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;df = spark.read.format("mongodb") \
    .option("connection.uri", "mongodb://&amp;lt;your-ec2-ip&amp;gt;:27017/") \
    .option("database", "mydatabase") \
    .option("collection", "mycollection") \
    .load()&lt;/LI-CODE&gt;&lt;P class=""&gt;&lt;STRONG&gt;Or set the defaults in the Spark config of your cluster:&lt;/STRONG&gt;&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;spark.mongodb.read.connection.uri mongodb://&amp;lt;user&amp;gt;:&amp;lt;pass&amp;gt;@&amp;lt;your-ec2-ip&amp;gt;:27017/
spark.mongodb.write.connection.uri mongodb://&amp;lt;user&amp;gt;:&amp;lt;pass&amp;gt;@&amp;lt;your-ec2-ip&amp;gt;:27017/&lt;/LI-CODE&gt;&lt;P class=""&gt;&lt;STRONG&gt;To verify network connectivity, try connecting directly from a Databricks notebook&lt;/STRONG&gt;&lt;SPAN class=""&gt;:&lt;/SPAN&gt;&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;%sh
nc -zv &amp;lt;ec2-ip&amp;gt; 27017&lt;/LI-CODE&gt;&lt;P class=""&gt;Since MongoDB runs on an EC2 instance, also make sure mongod is listening on &lt;SPAN class=""&gt;0.0.0.0&lt;/SPAN&gt; (bindIp in mongod.conf) and that the instance's security group allows inbound connections on port 27017 from your Databricks cluster.&lt;BR /&gt;&lt;BR /&gt;Hope this helps &lt;span class="lia-unicode-emoji" title=":slightly_smiling_face:"&gt;🙂&lt;/span&gt;&lt;BR /&gt;&lt;BR /&gt;Isi&lt;/P&gt;</description>
      <pubDate>Wed, 28 May 2025 21:21:03 GMT</pubDate>
      <guid>https://community.databricks.com/t5/warehousing-analytics/mongodb-spark-connection-issues/m-p/120478#M2083</guid>
      <dc:creator>Isi</dc:creator>
      <dc:date>2025-05-28T21:21:03Z</dc:date>
    </item>
  </channel>
</rss>

