<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Why I'm getting connection timeout when connecting to MongoDB using MongoDB Connector for Spark 10.x from Databricks in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/why-i-m-getting-connection-timeout-when-connecting-to-mongodb/m-p/14874#M9290</link>
    <description>&lt;P&gt;Hello Abel, I'm facing a similar problem. Which option did you use with 10.0.5?&lt;/P&gt;</description>
    <pubDate>Tue, 11 Apr 2023 21:56:38 GMT</pubDate>
    <dc:creator>Anonymous</dc:creator>
    <dc:date>2023-04-11T21:56:38Z</dc:date>
    <item>
      <title>Why I'm getting connection timeout when connecting to MongoDB using MongoDB Connector for Spark 10.x from Databricks</title>
      <link>https://community.databricks.com/t5/data-engineering/why-i-m-getting-connection-timeout-when-connecting-to-mongodb/m-p/14866#M9282</link>
      <description>&lt;P&gt;I'm able to connect to MongoDB using org.mongodb.spark:mongo-spark-connector_2.12:3.0.2 and this code:&lt;/P&gt;&lt;PRE&gt;&lt;CODE&gt;df = spark.read.format("com.mongodb.spark.sql.DefaultSource").option("uri", jdbcUrl)&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;It works well, but if I install the latest MongoDB Spark Connector version 10.0.5 and try to connect using:&lt;/P&gt;&lt;PRE&gt;&lt;CODE&gt;df = spark.read.format("mongodb").option("spark.mongodb.input.uri", jdbcUrl)&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;the MongoDB server returns a connection timeout. Any ideas? I need to work with the 10.x libraries because they allow structured streaming from MongoDB.&lt;/P&gt;</description>
      <pubDate>Fri, 23 Dec 2022 15:44:50 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/why-i-m-getting-connection-timeout-when-connecting-to-mongodb/m-p/14866#M9282</guid>
      <dc:creator>Abel_Martinez</dc:creator>
      <dc:date>2022-12-23T15:44:50Z</dc:date>
    </item>
    <item>
      <title>Re: Why I'm getting connection timeout when connecting to MongoDB using MongoDB Connector for Spark 10.x from Databricks</title>
      <link>https://community.databricks.com/t5/data-engineering/why-i-m-getting-connection-timeout-when-connecting-to-mongodb/m-p/14867#M9283</link>
      <description>&lt;P&gt;Please run the %sh magic command in the notebook and check whether you have a route to the server (this can be a network issue):&lt;/P&gt;&lt;PRE&gt;&lt;CODE&gt;%sh
telnet &amp;lt;mongodb_server_name&amp;gt; 28017&lt;/CODE&gt;&lt;/PRE&gt;</description>
      <pubDate>Fri, 23 Dec 2022 16:41:13 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/why-i-m-getting-connection-timeout-when-connecting-to-mongodb/m-p/14867#M9283</guid>
      <dc:creator>Hubert-Dudek</dc:creator>
      <dc:date>2022-12-23T16:41:13Z</dc:date>
    </item>
    <item>
      <title>Re: Why I'm getting connection timeout when connecting to MongoDB using MongoDB Connector for Spark 10.x from Databricks</title>
      <link>https://community.databricks.com/t5/data-engineering/why-i-m-getting-connection-timeout-when-connecting-to-mongodb/m-p/14868#M9284</link>
      <description>&lt;P&gt;Hi @Abel Martinez,&lt;/P&gt;&lt;P&gt;Validate the connection between the cluster and the MongoDB server. Run nc against the URL and check whether the connection works on the MongoDB port (27017):&lt;/P&gt;&lt;PRE&gt;&lt;CODE&gt;%sh nc -zv &amp;lt;url&amp;gt; &amp;lt;port&amp;gt;&lt;/CODE&gt;&lt;/PRE&gt;&lt;PRE&gt;&lt;CODE&gt;%sh curl -vvv URL&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;If there are no issues on the networking side, please share the complete error message so we can check further.&lt;/P&gt;</description>
      <pubDate>Fri, 23 Dec 2022 17:27:51 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/why-i-m-getting-connection-timeout-when-connecting-to-mongodb/m-p/14868#M9284</guid>
      <dc:creator>Vivian_Wilfred</dc:creator>
      <dc:date>2022-12-23T17:27:51Z</dc:date>
    </item>
    <item>
      <title>Re: Why I'm getting connection timeout when connecting to MongoDB using MongoDB Connector for Spark 10.x from Databricks</title>
      <link>https://community.databricks.com/t5/data-engineering/why-i-m-getting-connection-timeout-when-connecting-to-mongodb/m-p/14869#M9285</link>
      <description>&lt;P&gt;Please also check for certificate issues: &lt;A href="https://gitlab.besedo.com/operation-tools/ebay-social-media-report/-/blob/master/classes/mailbox.py" target="_blank"&gt;https://gitlab.besedo.com/operation-tools/ebay-social-media-report/-/blob/master/classes/mailbox.py&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Fri, 23 Dec 2022 22:07:25 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/why-i-m-getting-connection-timeout-when-connecting-to-mongodb/m-p/14869#M9285</guid>
      <dc:creator>Hubert-Dudek</dc:creator>
      <dc:date>2022-12-23T22:07:25Z</dc:date>
    </item>
    <item>
      <title>Re: Why I'm getting connection timeout when connecting to MongoDB using MongoDB Connector for Spark 10.x from Databricks</title>
      <link>https://community.databricks.com/t5/data-engineering/why-i-m-getting-connection-timeout-when-connecting-to-mongodb/m-p/14870#M9286</link>
      <description>&lt;P&gt;There can be three things:&lt;/P&gt;&lt;P&gt;1. Your firewall is blocking its IP.&lt;/P&gt;&lt;P&gt;2. Your logs are getting huge traffic.&lt;/P&gt;&lt;P&gt;3. It can be the Databricks deployment in your workspace.&lt;/P&gt;</description>
      <pubDate>Sat, 24 Dec 2022 04:33:00 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/why-i-m-getting-connection-timeout-when-connecting-to-mongodb/m-p/14870#M9286</guid>
      <dc:creator>Aviral-Bhardwaj</dc:creator>
      <dc:date>2022-12-24T04:33:00Z</dc:date>
    </item>
    <item>
      <title>Re: Why I'm getting connection timeout when connecting to MongoDB using MongoDB Connector for Spark 10.x from Databricks</title>
      <link>https://community.databricks.com/t5/data-engineering/why-i-m-getting-connection-timeout-when-connecting-to-mongodb/m-p/14871#M9287</link>
      <description>&lt;P&gt;Do you see any error message in your driver logs? If you do, please share the information here.&lt;/P&gt;</description>
      <pubDate>Tue, 27 Dec 2022 23:08:25 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/why-i-m-getting-connection-timeout-when-connecting-to-mongodb/m-p/14871#M9287</guid>
      <dc:creator>jose_gonzalez</dc:creator>
      <dc:date>2022-12-27T23:08:25Z</dc:date>
    </item>
    <item>
      <title>Re: Why I'm getting connection timeout when connecting to MongoDB using MongoDB Connector for Spark 10.x from Databricks</title>
      <link>https://community.databricks.com/t5/data-engineering/why-i-m-getting-connection-timeout-when-connecting-to-mongodb/m-p/14872#M9288</link>
      <description>&lt;P&gt;Hi all, thanks for your answers.&lt;/P&gt;&lt;P&gt;This is not a connection issue: the same code, with the same IPs and ports, works using &lt;I&gt;format("com.mongodb.spark.sql.DefaultSource")&lt;/I&gt; but does not work using &lt;I&gt;format("mongodb")&lt;/I&gt;.&lt;/P&gt;&lt;P&gt;Here is the error message from the driver logs:&lt;/P&gt;&lt;PRE&gt;&lt;CODE&gt;23/01/09 10:01:19 INFO cluster: Exception in monitor thread while connecting to server localhost:27017
com.mongodb.MongoSocketOpenException: Exception opening socket
	at com.mongodb.internal.connection.SocketStream.open(SocketStream.java:70)
	at com.mongodb.internal.connection.InternalStreamConnection.open(InternalStreamConnection.java:180)
	at com.mongodb.internal.connection.DefaultServerMonitor$ServerMonitorRunnable.lookupServerDescription(DefaultServerMonitor.java:188)
	at com.mongodb.internal.connection.DefaultServerMonitor$ServerMonitorRunnable.run(DefaultServerMonitor.java:152)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.net.ConnectException: Connection refused (Connection refused)
	at java.net.PlainSocketImpl.socketConnect(Native Method)
	at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
	at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
	at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
	at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
	at java.net.Socket.connect(Socket.java:613)
	at com.mongodb.internal.connection.SocketStreamHelper.initialize(SocketStreamHelper.java:107)
	at com.mongodb.internal.connection.SocketStream.initializeSocket(SocketStream.java:79)
	at com.mongodb.internal.connection.SocketStream.open(SocketStream.java:65)
	... 4 more
23/01/09 10:01:49 WARN JupyterDriverLocal: User code returned error with traceback:
---------------------------------------------------------------------------
Py4JJavaError                             Traceback (most recent call last)
&amp;lt;command-2679801502558508&amp;gt; in &amp;lt;cell line: 3&amp;gt;()
      2
      3 for row in sqlDF.collect():
----&amp;gt; 4     getData(row)

&amp;lt;command-2679801502558507&amp;gt; in getData(row)
     43
     44
---&amp;gt; 45     df = spark.read.format("mongodb").option("spark.mongodb.input.uri", jdbcUrl) \
     46                                      .option("database", row['source_schema']) \
     47                                      .option("collection", row['source_table']).load()

/databricks/spark/python/pyspark/instrumentation_utils.py in wrapper(*args, **kwargs)
     46             start = time.perf_counter()
     47             try:
---&amp;gt; 48                 res = func(*args, **kwargs)
     49                 logger.log_success(
     50                     module_name, class_name, function_name, time.perf_counter() - start, signature

/databricks/spark/python/pyspark/sql/readwriter.py in load(self, path, format, schema, **options)
    182             return self._df(self._jreader.load(self._spark._sc._jvm.PythonUtils.toSeq(path)))
    183         else:
--&amp;gt; 184             return self._df(self._jreader.load())
    185
    186     def json(

/databricks/spark/python/lib/py4j-0.10.9.5-src.zip/py4j/java_gateway.py in __call__(self, *args)
   1319
   1320         answer = self.gateway_client.send_command(command)
-&amp;gt; 1321         return_value = get_return_value(
   1322             answer, self.gateway_client, self.target_id, self.name)
   1323

/databricks/spark/python/pyspark/sql/utils.py in deco(*a, **kw)
    194     def deco(*a: Any, **kw: Any) -&amp;gt; Any:
    195         try:
--&amp;gt; 196             return f(*a, **kw)
    197         except Py4JJavaError as e:
    198             converted = convert_exception(e.java_exception)

/databricks/spark/python/lib/py4j-0.10.9.5-src.zip/py4j/protocol.py in get_return_value(answer, gateway_client, target_id, name)
    324             value = OUTPUT_CONVERTER[type](answer[2:], gateway_client)
    325             if answer[1] == REFERENCE_TYPE:
--&amp;gt; 326                 raise Py4JJavaError(
    327                     "An error occurred while calling {0}{1}{2}.\n".
    328                     format(target_id, ".", name), value)

Py4JJavaError: An error occurred while calling o604.load.
: com.mongodb.MongoTimeoutException: Timed out after 30000 ms while waiting to connect. Client view of cluster state is {type=UNKNOWN, servers=[{address=localhost:27017, type=UNKNOWN, state=CONNECTING, exception={com.mongodb.MongoSocketOpenException: Exception opening socket}, caused by {java.net.ConnectException: Connection refused (Connection refused)}}]&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;&lt;/P&gt;</description>
      <pubDate>Mon, 09 Jan 2023 10:17:32 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/why-i-m-getting-connection-timeout-when-connecting-to-mongodb/m-p/14872#M9288</guid>
      <dc:creator>Abel_Martinez</dc:creator>
      <dc:date>2023-01-09T10:17:32Z</dc:date>
    </item>
    <item>
      <title>Re: Why I'm getting connection timeout when connecting to MongoDB using MongoDB Connector for Spark 10.x from Databricks</title>
      <link>https://community.databricks.com/t5/data-engineering/why-i-m-getting-connection-timeout-when-connecting-to-mongodb/m-p/14873#M9289</link>
      <description>&lt;P&gt;Hi all, I finally found the solution myself. The problem was with the options, which have changed in the latest version.&lt;/P&gt;&lt;P&gt;Working code:&lt;/P&gt;&lt;PRE&gt;&lt;CODE&gt;spark.read.format("mongodb").option("spark.mongodb.read.connection.uri" ....&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;Thanks!&lt;/P&gt;</description>
      <pubDate>Mon, 09 Jan 2023 10:49:17 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/why-i-m-getting-connection-timeout-when-connecting-to-mongodb/m-p/14873#M9289</guid>
      <dc:creator>Abel_Martinez</dc:creator>
      <dc:date>2023-01-09T10:49:17Z</dc:date>
    </item>
    <item>
      <title>Re: Why I'm getting connection timeout when connecting to MongoDB using MongoDB Connector for Spark 10.x from Databricks</title>
      <link>https://community.databricks.com/t5/data-engineering/why-i-m-getting-connection-timeout-when-connecting-to-mongodb/m-p/14874#M9290</link>
      <description>&lt;P&gt;Hello Abel, I'm facing a similar problem. Which option did you use with 10.0.5?&lt;/P&gt;</description>
      <pubDate>Tue, 11 Apr 2023 21:56:38 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/why-i-m-getting-connection-timeout-when-connecting-to-mongodb/m-p/14874#M9290</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2023-04-11T21:56:38Z</dc:date>
    </item>
    <item>
      <title>Re: Why I'm getting connection timeout when connecting to MongoDB using MongoDB Connector for Spark</title>
      <link>https://community.databricks.com/t5/data-engineering/why-i-m-getting-connection-timeout-when-connecting-to-mongodb/m-p/95707#M39157</link>
      <description>&lt;P&gt;I was facing the same issue; it is now resolved, thanks to &lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/43003"&gt;@Abel_Martinez&lt;/a&gt;.&lt;/P&gt;&lt;P&gt;I am using code like this:&lt;/P&gt;&lt;PRE&gt;&lt;CODE&gt;df = spark.read.format("mongodb") \
    .option('spark.mongodb.read.connection.uri', "mongodb+srv://*****:*****@******/?retryWrites=true&amp;amp;w=majority&amp;amp;appName=****&amp;amp;tls=true") \
    .option('database', 'database_name') \
    .option('collection', 'collection_name') \
    .load()&lt;/CODE&gt;&lt;/PRE&gt;</description>
      <pubDate>Wed, 23 Oct 2024 12:20:57 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/why-i-m-getting-connection-timeout-when-connecting-to-mongodb/m-p/95707#M39157</guid>
      <dc:creator>ravisharma1024</dc:creator>
      <dc:date>2024-10-23T12:20:57Z</dc:date>
    </item>
    <item>
      <title>Re: Why I'm getting connection timeout when connecting to MongoDB using MongoDB Connector for Spark</title>
      <link>https://community.databricks.com/t5/data-engineering/why-i-m-getting-connection-timeout-when-connecting-to-mongodb/m-p/122188#M46686</link>
      <description>&lt;P&gt;Just want to throw my hat in: I was running into the same issue and this was finally the solution. Thanks for posting it, Abel!&lt;/P&gt;</description>
      <pubDate>Wed, 18 Jun 2025 20:15:53 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/why-i-m-getting-connection-timeout-when-connecting-to-mongodb/m-p/122188#M46686</guid>
      <dc:creator>ed_martinez</dc:creator>
      <dc:date>2025-06-18T20:15:53Z</dc:date>
    </item>
  </channel>
</rss>

