<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Spark connector to mongodb - mongo-spark-connector_2.12:10.1.1 in Machine Learning</title>
    <link>https://community.databricks.com/t5/machine-learning/spark-connector-to-mongodb-mongo-spark-connector-2-12-10-1-1/m-p/38525#M2002</link>
    <description>&lt;P&gt;In version 10.x of the MongoDB Spark Connector, some configuration options have changed.&lt;/P&gt;&lt;P&gt;You now have to pass&amp;nbsp;&lt;SPAN&gt;&lt;STRONG&gt;spark.mongodb.read.connection.uri&lt;/STRONG&gt; instead of&amp;nbsp;&lt;STRONG&gt;spark.mongodb.input.uri&lt;/STRONG&gt;.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;Check out the other new options in&amp;nbsp;&lt;A href="https://www.mongodb.com/docs/spark-connector/v10.2/configuration/read/#connection.uri-configuration-setting" target="_blank"&gt;Read Configuration Options — MongoDB Spark Connector&lt;/A&gt;.&lt;/SPAN&gt;&lt;/P&gt;</description>
    <pubDate>Wed, 26 Jul 2023 21:34:38 GMT</pubDate>
    <dc:creator>silvadev</dc:creator>
    <dc:date>2023-07-26T21:34:38Z</dc:date>
    <item>
      <title>Spark connector to mongodb - mongo-spark-connector_2.12:10.1.1</title>
      <link>https://community.databricks.com/t5/machine-learning/spark-connector-to-mongodb-mongo-spark-connector-2-12-10-1-1/m-p/6037#M271</link>
      <description>&lt;P&gt;Hello,&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;I've added a library to the cluster, and it appears in the Spark UI as "Added By User":&lt;/P&gt;&lt;P&gt;spark://10.139.64.4:43001/jars/addedFile307892533757162075org_mongodb_spark_mongo_spark_connector_2_12_10_1_1-98946.jarAdded By User&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;I'm trying to connect using the following SparkSession configuration, but it is not working:&lt;/P&gt;&lt;P&gt;spark = (SparkSession.builder.config('spark.mongodb.input.uri', connectionString).config('spark.jars.packages', 'org.mongodb.spark:mongo-spark-connector_2.12:10.1.1').getOrCreate())&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;If I uninstall this library and install the previous one, 2.12:3.0.1, the connection works.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Can anyone help me with that?&lt;/P&gt;&lt;P&gt;Thanks&lt;/P&gt;</description>
      <pubDate>Tue, 11 Apr 2023 21:49:02 GMT</pubDate>
      <guid>https://community.databricks.com/t5/machine-learning/spark-connector-to-mongodb-mongo-spark-connector-2-12-10-1-1/m-p/6037#M271</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2023-04-11T21:49:02Z</dc:date>
    </item>
    <item>
      <title>Re: Spark connector to mongodb - mongo-spark-connector_2.12:10.1.1</title>
      <link>https://community.databricks.com/t5/machine-learning/spark-connector-to-mongodb-mongo-spark-connector-2-12-10-1-1/m-p/6038#M272</link>
      <description>&lt;P&gt;It looks like you are trying to connect to MongoDB using the mongo-spark-connector_2.12:10.1.1 library but are facing issues with the connection. Here are a few things you can try to resolve the issue:&lt;/P&gt;&lt;OL&gt;&lt;LI&gt;Double-check the connection string: make sure that the connection string you are using is correct and has the right format. You can verify this by connecting to MongoDB directly using the mongo shell or a MongoDB client.&lt;/LI&gt;&lt;LI&gt;Check the Spark logs: look for any error messages in the Spark logs; they might give you some clues about the issue. You can access the logs from the Spark UI or by running a command such as:&lt;PRE&gt;&lt;CODE&gt;$SPARK_HOME/bin/spark-submit --class org.apache.spark.examples.SparkPi --master yarn --deploy-mode client --driver-memory 4g --executor-memory 2g --executor-cores 1 --num-executors 2 --conf spark.eventLog.enabled=true --conf spark.eventLog.dir=hdfs:///spark-history --conf spark.history.fs.logDirectory=hdfs:///spark-history --jars /path/to/mongo-spark-connector_2.12-10.1.1.jar /path/to/your/application.jar&lt;/CODE&gt;&lt;/PRE&gt;&lt;/LI&gt;&lt;LI&gt;Try a different version of the library: if the above two steps don't work, you can try a different version of the mongo-spark-connector library. You can find the list of available versions here: &lt;A href="https://mvnrepository.com/artifact/org.mongodb.spark/mongo-spark-connector_2.12" target="_blank"&gt;https://mvnrepository.com/artifact/org.mongodb.spark/mongo-spark-connector_2.12&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;Check compatibility with the MongoDB server version: make sure that the version of the mongo-spark-connector library you are using is compatible with the version of the MongoDB server you are running. You can check the compatibility matrix here: &lt;A href="https://docs.mongodb.com/spark-connector/master/#compatibility-matrix" target="_blank"&gt;https://docs.mongodb.com/spark-connector/master/#compatibility-matrix&lt;/A&gt;&lt;/LI&gt;&lt;/OL&gt;&lt;P&gt;I hope these suggestions help you resolve the issue.&lt;/P&gt;</description>
      <pubDate>Sun, 16 Apr 2023 01:13:12 GMT</pubDate>
      <guid>https://community.databricks.com/t5/machine-learning/spark-connector-to-mongodb-mongo-spark-connector-2-12-10-1-1/m-p/6038#M272</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2023-04-16T01:13:12Z</dc:date>
    </item>
    <item>
      <title>Re: Spark connector to mongodb - mongo-spark-connector_2.12:10.1.1</title>
      <link>https://community.databricks.com/t5/machine-learning/spark-connector-to-mongodb-mongo-spark-connector-2-12-10-1-1/m-p/38525#M2002</link>
      <description>&lt;P&gt;In version 10.x of the MongoDB Spark Connector, some configuration options have changed.&lt;/P&gt;&lt;P&gt;You now have to pass&amp;nbsp;&lt;SPAN&gt;&lt;STRONG&gt;spark.mongodb.read.connection.uri&lt;/STRONG&gt; instead of&amp;nbsp;&lt;STRONG&gt;spark.mongodb.input.uri&lt;/STRONG&gt;.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;Check out the other new options in&amp;nbsp;&lt;A href="https://www.mongodb.com/docs/spark-connector/v10.2/configuration/read/#connection.uri-configuration-setting" target="_blank"&gt;Read Configuration Options — MongoDB Spark Connector&lt;/A&gt;.&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 26 Jul 2023 21:34:38 GMT</pubDate>
      <guid>https://community.databricks.com/t5/machine-learning/spark-connector-to-mongodb-mongo-spark-connector-2-12-10-1-1/m-p/38525#M2002</guid>
      <dc:creator>silvadev</dc:creator>
      <dc:date>2023-07-26T21:34:38Z</dc:date>
    </item>
    <item>
      <title>Re: Spark connector to mongodb - mongo-spark-connector_2.12:10.1.1</title>
      <link>https://community.databricks.com/t5/machine-learning/spark-connector-to-mongodb-mongo-spark-connector-2-12-10-1-1/m-p/39059#M2023</link>
      <description>&lt;P&gt;I face a similar problem with anything above&amp;nbsp;org.mongodb.spark:mongo-spark-connector_2.12:3.0.1.&lt;BR /&gt;So &lt;STRONG&gt;versions 10+&lt;/STRONG&gt; of&amp;nbsp;org.mongodb.spark:mongo-spark-connector_2.12 from&amp;nbsp;&lt;A href="https://mvnrepository.com/artifact/org.mongodb.spark/mongo-spark-connector_2.12" target="_blank" rel="noopener"&gt;https://mvnrepository.com/artifact/org.mongodb.spark/mongo-spark-connector_2.12&lt;/A&gt;&amp;nbsp;are &lt;STRONG&gt;not working&lt;/STRONG&gt; with Databricks 12.2 LTS&amp;nbsp;&lt;span class="lia-unicode-emoji" title=":persevering_face:"&gt;😣&lt;/span&gt;&lt;/P&gt;&lt;LI-CODE lang="python"&gt;org.apache.spark.SparkClassNotFoundException: [DATA_SOURCE_NOT_FOUND] Failed to find data source: mongo. Please find packages at `https://spark.apache.org/third-party-projects.html`.&lt;/LI-CODE&gt;&lt;P&gt;Is there anything that should be done in addition to installing it as a cluster library? Maybe some additional cluster option?&lt;/P&gt;</description>
      <pubDate>Thu, 03 Aug 2023 22:12:04 GMT</pubDate>
      <guid>https://community.databricks.com/t5/machine-learning/spark-connector-to-mongodb-mongo-spark-connector-2-12-10-1-1/m-p/39059#M2023</guid>
      <dc:creator>DmytroSokhach</dc:creator>
      <dc:date>2023-08-03T22:12:04Z</dc:date>
    </item>
    <item>
      <title>Re: Spark connector to mongodb - mongo-spark-connector_2.12:10.1.1</title>
      <link>https://community.databricks.com/t5/machine-learning/spark-connector-to-mongodb-mongo-spark-connector-2-12-10-1-1/m-p/50109#M2706</link>
      <description>&lt;P&gt;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/85844"&gt;@DmytroSokhach&lt;/a&gt;&amp;nbsp;I think it works if you change &lt;STRONG&gt;mongo&lt;/STRONG&gt; to &lt;STRONG&gt;mongodb&lt;/STRONG&gt; in the format option, and use&amp;nbsp;&lt;SPAN&gt;&lt;STRONG&gt;spark.mongodb.read.connection.uri&lt;/STRONG&gt;&amp;nbsp;instead of&amp;nbsp;&lt;STRONG&gt;spark.mongodb.input.uri&lt;/STRONG&gt;, as&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/85412"&gt;@silvadev&lt;/a&gt;&amp;nbsp;suggested.&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Mon, 30 Oct 2023 11:46:01 GMT</pubDate>
      <guid>https://community.databricks.com/t5/machine-learning/spark-connector-to-mongodb-mongo-spark-connector-2-12-10-1-1/m-p/50109#M2706</guid>
      <dc:creator>FurqanAmin</dc:creator>
      <dc:date>2023-10-30T11:46:01Z</dc:date>
    </item>
  </channel>
</rss>