Data load from Azure Databricks DataFrame to Cosmos DB container
09-19-2022 09:51 PM
I am trying to load data from an Azure Databricks DataFrame to a Cosmos DB container using the command below:
cfg = {
    "spark.cosmos.accountEndpoint": cosmosEndpoint,    # Cosmos DB account URI
    "spark.cosmos.accountKey": cosmosMasterKey,        # account master key
    "spark.cosmos.database": cosmosDatabaseName,       # target database
    "spark.cosmos.container": cosmosContainerName,     # target container
}
df.write.format("cosmos.oltp").options(**cfg).mode("append").save()
My question: if the target Cosmos container already has documents whose id matches the source data, will this write overwrite those documents, will it update them by adding the new items, or will it only append new ids and leave existing ones untouched?
Also, what is the upsert write strategy, and can it be used together with append mode?
Background: I am loading customer transaction details for a retail business by manually running the append for different time intervals, e.g. 2012 to 2017 and then 2017 to 2022. The same customer would have made transactions in both periods, so how will append mode handle that?
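For reference, here is a minimal sketch of how I understand the connector controls this (this assumes the azure-cosmos-spark Spark 3 OLTP connector; the spark.cosmos.write.strategy option and its ItemOverwrite / ItemAppend values come from that connector's documentation, so treat the names as an assumption):

# Sketch: choosing conflict behavior with the Spark 3 OLTP connector.
# "ItemOverwrite" upserts: a document with an existing id (within the same
# partition key) is replaced by the incoming row.
cfg_upsert = {
    "spark.cosmos.accountEndpoint": cosmosEndpoint,
    "spark.cosmos.accountKey": cosmosMasterKey,
    "spark.cosmos.database": cosmosDatabaseName,
    "spark.cosmos.container": cosmosContainerName,
    "spark.cosmos.write.strategy": "ItemOverwrite",  # upsert semantics
}
df.write.format("cosmos.oltp").options(**cfg_upsert).mode("append").save()

# "ItemAppend" inserts only new ids and ignores conflicts on existing ones:
cfg_insert = {**cfg_upsert, "spark.cosmos.write.strategy": "ItemAppend"}
df.write.format("cosmos.oltp").options(**cfg_insert).mode("append").save()

As I understand it, mode("append") is just the Spark DataFrame save mode; the per-document behavior on conflicting ids is governed by the write strategy, not the save mode.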
- Labels:
- Azure
- Azure Databricks
- Data
- Data load
09-20-2022 01:33 PM
Hi, partial document updates can be done using patch: https://learn.microsoft.com/en-us/azure/cosmos-db/sql/create-sql-api-spark?tabs=python#partial-docum.... Also, Databricks notebooks support Python; could you please reach out to the Azure support (Cosmos DB) team to confirm whether they support append through Python? (Python can be run from any Python editor, including a Databricks notebook, once Azure Databricks is set up with the Cosmos DB endpoint.)
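For illustration, a minimal sketch of what a patch-style write might look like under the same connector assumptions as above (the ItemPatch strategy and the spark.cosmos.write.patch.* options are taken from the linked doc; names unverified, and the column name is hypothetical):

# Sketch: patch only specific columns on matching documents instead of
# replacing the whole document.
cfg_patch = {
    "spark.cosmos.accountEndpoint": cosmosEndpoint,
    "spark.cosmos.accountKey": cosmosMasterKey,
    "spark.cosmos.database": cosmosDatabaseName,
    "spark.cosmos.container": cosmosContainerName,
    "spark.cosmos.write.strategy": "ItemPatch",
    # Default patch operation applied to written columns.
    "spark.cosmos.write.patch.defaultOperationType": "Set",
    # Hypothetical column config: apply a "set" patch to one column only.
    "spark.cosmos.write.patch.columnConfigs": "[col(lastTxnAmount).op(set)]",
}
df.write.format("cosmos.oltp").options(**cfg_patch).mode("append").save()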
10-02-2022 04:02 AM
Hey @Rama Santosh Ravada
Hope all is well! Just wanted to check in to see whether you were able to resolve your issue. If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help.
We'd love to hear from you.
Thanks!

