Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.
Is there a way to ingest Azure CosmosDB data via Delta Live Tables? With regular workflows it works fine, but with DLT I'm not able to install the CosmosDB Spark connector on the cluster.
In DLT, you can install libraries only through pip. The problem is that the CosmosDB Spark connector is distributed via Maven as a JAR, and I don't think it comes preinstalled.
Maybe you can send the change data feed to Event Hubs and read it in DLT through the Kafka interface.
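As a rough sketch of that approach: Event Hubs exposes a Kafka-compatible endpoint on port 9093 (SASL_SSL with the PLAIN mechanism, username `$ConnectionString`), so a DLT table can read it with Spark's built-in Kafka source, no Maven library needed. The namespace, event hub, and secret names below are hypothetical placeholders.

```python
# Sketch: consuming Azure Event Hubs via its Kafka-compatible endpoint in DLT.
# All names (namespace, event hub, secret scope) are hypothetical.

def event_hubs_kafka_options(namespace: str, event_hub: str,
                             connection_string: str) -> dict:
    """Build Spark Kafka source options for an Event Hubs namespace.

    Event Hubs speaks Kafka on port 9093 over SASL_SSL/PLAIN; the SASL
    username is the literal string '$ConnectionString' and the password is
    the Event Hubs connection string. On Databricks the Kafka classes are
    shaded, hence the 'kafkashaded.' prefix on the JAAS login module.
    """
    jaas = (
        "kafkashaded.org.apache.kafka.common.security.plain.PlainLoginModule "
        f'required username="$ConnectionString" password="{connection_string}";'
    )
    return {
        "kafka.bootstrap.servers": f"{namespace}.servicebus.windows.net:9093",
        "subscribe": event_hub,
        "kafka.security.protocol": "SASL_SSL",
        "kafka.sasl.mechanism": "PLAIN",
        "kafka.sasl.jaas.config": jaas,
        "startingOffsets": "earliest",
    }

# In the DLT pipeline notebook (runs only on Databricks):
#
# import dlt
# from pyspark.sql.functions import col
#
# @dlt.table(name="cosmos_changes_raw")
# def cosmos_changes_raw():
#     opts = event_hubs_kafka_options(
#         "my-namespace", "cosmos-cdf",
#         dbutils.secrets.get("my-scope", "eventhubs-connection-string"),
#     )
#     return (
#         spark.readStream.format("kafka")
#         .options(**opts)
#         .load()
#         .select(col("value").cast("string").alias("body"))
#     )
```

The helper just assembles the option dictionary; the commented-out part shows where it would plug into a `@dlt.table` definition.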
I would likely do the same: send the change data feed to Event Hubs, but then materialize it in abfss as JSON, Parquet, or a similar format, and consume it in DLT using Auto Loader.
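A minimal sketch of that last step, assuming the change data has already landed as JSON files in a hypothetical ADLS container: Auto Loader is Spark's `cloudFiles` streaming source, so it works in DLT without any extra library. The storage account, container, and paths below are placeholders.

```python
# Sketch: picking up landed JSON files with Auto Loader (cloudFiles) in DLT.
# Storage account, container, and paths are hypothetical.

def autoloader_options(source_format: str, schema_location: str) -> dict:
    """Build cloudFiles (Auto Loader) options for a streaming file source.

    'cloudFiles.schemaLocation' is where Auto Loader tracks the inferred
    schema and its evolution between runs.
    """
    return {
        "cloudFiles.format": source_format,
        "cloudFiles.schemaLocation": schema_location,
        "cloudFiles.inferColumnTypes": "true",
    }

# In the DLT pipeline notebook (runs only on Databricks):
#
# import dlt
#
# LANDING = "abfss://landing@mystorage.dfs.core.windows.net/cosmos-cdf/"
# SCHEMAS = "abfss://landing@mystorage.dfs.core.windows.net/_schemas/cosmos-cdf"
#
# @dlt.table(name="cosmos_changes_bronze")
# def cosmos_changes_bronze():
#     opts = autoloader_options("json", SCHEMAS)
#     return (
#         spark.readStream.format("cloudFiles")
#         .options(**opts)
#         .load(LANDING)
#     )
```

Auto Loader's checkpointed file discovery means each landed file is ingested exactly once, which is what makes this land-then-load pattern a reasonable alternative to reading Event Hubs directly.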