Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.
Need help inserting huge data into Cosmos DB from Azure Data Lake Storage using Databricks

manasa
Contributor

I am trying to insert 6 GB of data into Cosmos DB using the OLTP connector.

Container RUs: 40,000

Cluster config: [image attachment: image.png]

# Cosmos DB Spark (OLTP) connector write configuration
cfg = {
  "spark.cosmos.accountEndpoint": cosmosdbendpoint,
  "spark.cosmos.accountKey": cosmosdbmasterkey,
  "spark.cosmos.database": cosmosdatabase,
  "spark.cosmos.container": cosmosdbcontainer,
  # Pass bulk mode directly as a write option so the DataFrame
  # writer is guaranteed to pick it up
  "spark.cosmos.write.bulk.enabled": "true",
}

# Catalog registration (only needed for Spark SQL / catalog operations)
spark.conf.set("spark.sql.catalog.cosmosCatalog", "com.azure.cosmos.spark.CosmosCatalog")
spark.conf.set("spark.sql.catalog.cosmosCatalog.spark.cosmos.accountEndpoint", cosmosdbendpoint)
spark.conf.set("spark.sql.catalog.cosmosCatalog.spark.cosmos.accountKey", cosmosdbmasterkey)

json_df.write.format("cosmos.oltp").options(**cfg).mode("append").save()

It is taking around 3 hours to load into Cosmos DB.

1. Is increasing RUs the only approach to decrease the execution time?

2. Other than the OLTP connector, is there any way to insert bulk data in less time?

3. How do I calculate the RUs needed based on data size?
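On question 3, a rough rule of thumb from the Azure Cosmos DB documentation is that writing a 1 KB document costs about 5 RUs (the exact cost depends on indexing policy and document shape, so treat the figure as an approximation). A back-of-the-envelope sketch using the numbers from this thread:

```python
# Rough RU sizing for a bulk load, assuming ~5 RUs per 1 KB written
# (approximate; actual cost varies with indexing policy and item size).

def estimate_load_seconds(data_gb, provisioned_rus, ru_per_kb_write=5):
    """Theoretical minimum load time if the container's full
    provisioned throughput were saturated by the writer."""
    total_kb = data_gb * 1024 * 1024
    total_rus = total_kb * ru_per_kb_write
    return total_rus / provisioned_rus

# The thread's numbers: 6 GB at 40,000 RU/s
seconds = estimate_load_seconds(6, 40_000)
print(f"~{seconds / 60:.0f} minutes at full utilization")  # ~13 minutes
```

The gap between this theoretical ~13-minute floor and the observed 3 hours suggests the writer is not saturating the provisioned throughput (parallelism, partition-key distribution, or client-side bottlenecks), rather than RUs being the only lever.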

4 REPLIES

Kaniz
Community Manager

Hi @Manasa Kalluri, this article explains how to read data from and write data to Azure Cosmos DB using Azure Databricks.

Hi @Kaniz Fatma, my problem is not with the resources. I tried everything mentioned in the article, but I still need to insert bulk data in less time (definitely not 3 hours for 6 GB of data). So I am looking for a more optimized approach.

SteveMeckstroth
New Contributor II

You have probably found a solution by now, but for others who end up here: I got dramatic improvements using the MongoDB Spark connector against Cosmos DB (via its Mongo API): https://www.mongodb.com/docs/spark-connector/current/write-to-mongodb/

ImAbhishekTomar
New Contributor III

Did anyone find a solution for this? I'm also using a similar cluster and RUs, and data ingestion is taking a lot of time.
