Azure CosmosDB change feed ingestion via DLT

Ogi
New Contributor II

Is there a way to ingest Azure Cosmos DB data via Delta Live Tables? With regular workflows it works well, but with DLT I'm not able to install the Cosmos DB connector on the cluster.

ACCEPTED SOLUTION

Hubert-Dudek
Esteemed Contributor III

In DLT, you can install libraries only through pip. The problem is that the Cosmos DB connector is distributed as a Maven/JAR package, and I don't think it is preinstalled.

Maybe you can send the change data feed to Event Hubs and read it in DLT through the Event Hubs Kafka-compatible endpoint.

https://learn.microsoft.com/en-us/azure/cosmos-db/nosql/changefeed-ecommerce-solution
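A minimal sketch of that workaround, assuming the change feed already lands in an Event Hub: DLT reads the hub through its Kafka-compatible endpoint on port 9093. The namespace, hub name, and secret scope/key below are hypothetical placeholders, not values from this thread.

```python
# Sketch: read Azure Event Hubs as a Kafka source inside a DLT pipeline.
# All names (namespace, hub, secret scope/key) are hypothetical.
try:
    import dlt  # only available inside a Delta Live Tables pipeline
    from pyspark.sql.functions import col
    IN_DLT = True
except ImportError:
    IN_DLT = False  # lets the helper below be used/tested standalone

def event_hubs_kafka_options(namespace: str, hub: str, connection_string: str) -> dict:
    """Build Kafka source options for Azure Event Hubs' Kafka-compatible endpoint."""
    return {
        "kafka.bootstrap.servers": f"{namespace}.servicebus.windows.net:9093",
        "subscribe": hub,
        "kafka.sasl.mechanism": "PLAIN",
        "kafka.security.protocol": "SASL_SSL",
        # Event Hubs authenticates with the literal username "$ConnectionString"
        "kafka.sasl.jaas.config": (
            "kafkashaded.org.apache.kafka.common.security.plain.PlainLoginModule "
            f'required username="$ConnectionString" password="{connection_string}";'
        ),
    }

if IN_DLT:
    @dlt.table(comment="Raw Cosmos DB change-feed events read from Event Hubs")
    def cosmos_changes_raw():
        # "eh-scope"/"eh-conn" are a hypothetical secret scope and key
        conn = dbutils.secrets.get("eh-scope", "eh-conn")
        opts = event_hubs_kafka_options("my-namespace", "cosmos-changes", conn)
        return (
            spark.readStream.format("kafka")
            .options(**opts)
            .load()
            .select(col("value").cast("string").alias("body"), "timestamp")
        )
```

The Kafka reader is built into the Databricks runtime, so no Maven/JAR install is needed, which is what makes this viable on a DLT cluster.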

REPLIES

Jfoxyyc
Valued Contributor

I would likely do the same, but instead of reading Event Hubs directly, I'd send the change data feed to Event Hubs, materialize it in abfss as JSON, Parquet, or some other format, and consume it in DLT using Auto Loader.
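A sketch of that variant, assuming some consumer has already written the Event Hub payloads as JSON files to a storage landing zone. The abfss path and table name are hypothetical; nothing here comes from the thread itself.

```python
# Sketch: ingest materialized change-feed files with Auto Loader in DLT.
# The landing path is a hypothetical placeholder.
try:
    import dlt  # only available inside a Delta Live Tables pipeline
    IN_DLT = True
except ImportError:
    IN_DLT = False  # lets the helper below be used/tested standalone

# Hypothetical abfss landing zone where the Event Hub consumer writes files.
LANDING_PATH = "abfss://landing@examplestorage.dfs.core.windows.net/cosmos-change-feed/"

def autoloader_options(file_format: str = "json") -> dict:
    """cloudFiles options for Auto Loader; works for json, parquet, etc."""
    return {
        "cloudFiles.format": file_format,
        "cloudFiles.inferColumnTypes": "true",
    }

if IN_DLT:
    @dlt.table(comment="Cosmos DB change-feed files ingested with Auto Loader")
    def cosmos_changes_bronze():
        return (
            spark.readStream.format("cloudFiles")
            .options(**autoloader_options("json"))
            .load(LANDING_PATH)
        )
```

The trade-off versus reading Event Hubs directly is extra latency and a storage hop, in exchange for a replayable file archive and Auto Loader's incremental file discovery.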

Ogi
New Contributor II

Thanks a lot! Just wanted to double-check whether this exists natively.
