Databricks + Snowflake Snowpipe Streaming
05-10-2023 04:28 AM
Does anyone know whether it is possible to use the Databricks Snowflake Connector together with the latest Snowflake feature, Snowpipe Streaming?
05-10-2023 06:17 AM
At its core, Snowpipe is a tool for copying data into Snowflake from cloud storage. It is not really about streaming; it batch-loads data from cloud storage into a table on a recurring basis.
Databricks has a similar feature called Auto Loader. Auto Loader lets developers create a Spark Structured Streaming pipeline with cloud files as a data source (see the sketch below).
Use Structured Streaming or Delta Live Tables to implement streaming use cases.
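For reference, a minimal Auto Loader sketch in PySpark could look like this; the bucket paths, schema location, and target table name are placeholders, not values from this thread.

```python
# Minimal Auto Loader sketch; paths and table names below are illustrative only.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Incrementally discover and ingest new files landing in cloud storage.
raw_events = (
    spark.readStream
    .format("cloudFiles")                                  # Auto Loader source
    .option("cloudFiles.format", "json")                   # format of the source files
    .option("cloudFiles.schemaLocation", "s3://my-bucket/_schemas/events")
    .load("s3://my-bucket/landing/events/")
)

# Write the stream into a Delta table; the checkpoint enables exactly-once delivery.
(
    raw_events.writeStream
    .option("checkpointLocation", "s3://my-bucket/_checkpoints/events")
    .toTable("raw.events")
)
```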
05-10-2023 06:33 AM
The use case we're trying to solve is loading data from Kafka into Snowflake using Databricks. We could use the Databricks Snowflake Connector, but only if it's able to use Snowpipe Streaming underneath.
05-10-2023 06:35 AM
Why do you need Snowpipe at all? You can connect to Kafka directly from Databricks: see https://www.dbdemos.ai/demo.html?demoName=streaming-sessionization
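As an illustration, reading Kafka directly with Structured Streaming could look like the sketch below; the broker address, topic, and target table are assumptions, not from the demo linked above.

```python
# Sketch of consuming Kafka directly from Databricks with Structured Streaming.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.getOrCreate()

kafka_stream = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker-1:9092")   # placeholder broker
    .option("subscribe", "events-topic")                  # placeholder topic
    .option("startingOffsets", "latest")
    .load()
)

# Kafka delivers key/value as binary; cast the payload before parsing it further.
parsed = kafka_stream.select(
    col("key").cast("string").alias("key"),
    col("value").cast("string").alias("value"),
)

(
    parsed.writeStream
    .option("checkpointLocation", "/tmp/checkpoints/kafka_events")
    .toTable("bronze.kafka_events")
)
```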
05-10-2023 08:22 AM
With the typical Snowflake Connector we cannot omit the Snowflake warehouse, which adds to the cost of the whole operation, whereas with Snowpipe Streaming we can omit the warehouse and reduce the cost: snowpipe-streaming-cost.
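For context, a typical batch write with the Spark Snowflake connector looks roughly like the sketch below; every connection value is a placeholder, and the relevant point is that sfWarehouse must name a running virtual warehouse, which is the cost being discussed.

```python
# Illustrative write via the Spark Snowflake connector; all values are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "label"])

sf_options = {
    "sfUrl": "myaccount.snowflakecomputing.com",
    "sfUser": "etl_user",
    "sfPassword": "********",
    "sfDatabase": "ANALYTICS",
    "sfSchema": "PUBLIC",
    "sfWarehouse": "LOAD_WH",  # a running warehouse is required for connector writes
}

(
    df.write
    .format("snowflake")       # on non-Databricks Spark use "net.snowflake.spark.snowflake"
    .options(**sf_options)
    .option("dbtable", "EVENTS")
    .mode("append")
    .save()
)
```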
06-09-2023 01:19 PM
@Piotr Tutak, I believe you don't need Snowflake at all - just take the source files/events from your data lake or message broker and process them within Databricks, using Auto Loader, possibly combined with DLT.
With Auto Loader you also get the flexibility to work in streaming mode and switch to batch if needed. DLT lets you guarantee data quality. Many more value-added features could be listed here if we compare against Snowpipe; among them, you'll be able to estimate and control costs - something that is still not possible with Snowpipe, unfortunately.
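As a rough illustration of combining Auto Loader with DLT data-quality expectations, something like the following could run inside a Delta Live Tables pipeline notebook; the paths, table names, and columns are assumptions.

```python
# Sketch of a DLT pipeline: Auto Loader ingestion plus a data-quality expectation.
import dlt
from pyspark.sql.functions import col

@dlt.table(comment="Raw events ingested incrementally with Auto Loader")
def raw_events():
    return (
        spark.readStream
        .format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load("s3://my-bucket/landing/events/")   # placeholder landing path
    )

@dlt.table(comment="Validated events")
@dlt.expect_or_drop("valid_event_id", "event_id IS NOT NULL")  # drop rows failing the check
def clean_events():
    return dlt.read_stream("raw_events").select("event_id", "event_ts", "payload")
```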

