Is it possible to disable retryWrites using .option()?

tigger
New Contributor III

Hello everyone,

I'm trying to write to DocumentDB using org.mongodb.spark:mongo-spark-connector_2.12:3.0.1. The DocumentDB cluster is version 4, which doesn't support Retryable Writes, so I disabled the feature by setting the option "retryWrites" to "false" (I also tried "False"). However, it didn't work. Do you know why?

uri = "mongodb://username:password@host.docdb.amazonaws.com:27017" 
(df.write.format("mongo")
 .option("uri", uri)
 .option("retryWrites", "false")  
 .option("database", "mydb")
 .option("collection", "employee")
 .mode("append")
 .save())
---
Command failed with error 301: Retryable writes are not supported

I tried setting the option directly in the URI as shown below, and that works, but I'd like to use .option() for all the connection options. Is that possible?

uri = "mongodb://username:password@host.docdb.amazonaws.com:27017/?retryWrites=false"
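Since the connector seems to only honor retryWrites when it appears in the URI, one workaround is to keep the options in a dict and assemble the query string programmatically. This is a minimal sketch; build_mongo_uri is a hypothetical helper, not part of the connector's API.

```python
from urllib.parse import urlencode

def build_mongo_uri(host, username, password, options):
    # Hypothetical helper: fold connection options into the URI query
    # string, since the connector appears to ignore some of them when
    # passed via .option().
    query = urlencode(options)
    return f"mongodb://{username}:{password}@{host}:27017/?{query}"

uri = build_mongo_uri(
    "host.docdb.amazonaws.com",
    "username",
    "password",
    {"retryWrites": "false"},
)
# uri == "mongodb://username:password@host.docdb.amazonaws.com:27017/?retryWrites=false"
```

You can then pass the resulting uri to .option("uri", uri) as in the write above, while still managing the individual settings as a dict in your code.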

Thanks!

1 ACCEPTED SOLUTION

Accepted Solutions

Sajesh
New Contributor III

Hi @Hugh Vo​ ,

I can't find retryWrites listed as an option in the MongoDB connector's configuration reference: https://docs.mongodb.com/spark-connector/current/configuration/#input-configuration.

It looks like it has to be passed as part of the URI.

View solution in original post

4 REPLIES

Kaniz
Community Manager

Hi @tigger! My name is Kaniz, and I'm the technical moderator here. Great to meet you, and thanks for your question! Let's see if your peers in the community have an answer first; otherwise I will get back to you soon. Thanks.


tigger
New Contributor III

Thanks @Sajesh Manakkunnath​ 

Anonymous
Not applicable

@Hugh Vo​ - If Sajesh's answer resolved the issue, would you be happy to mark it as best?
