Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Databricks streaming dataframe into Snowflake

babyhari
New Contributor II

Any suggestions on how to stream data from Databricks into Snowflake? Is Snowpipe the only option? Snowpipe is not fast enough for us, since it runs COPY INTO at small batch intervals rather than within a few seconds. If there is no option other than Snowpipe, how do I call it from a Databricks notebook? Thanks in advance for your help.

 

The code below did not work, since my dataDF is a streaming DataFrame.

 

import net.snowflake.spark.snowflake.Utils
import org.apache.spark.sql.SaveMode

// sfOptions is a Map of Snowflake connector settings (sfURL, sfUser, sfDatabase, ...) defined elsewhere
dataDF.write
  .format("snowflake")
  .options(sfOptions)
  .option("dbtable", "demokinesisstream")
  .mode(SaveMode.Append)
  .save()
2 REPLIES

Tharun-Kumar
Honored Contributor II

@babyhari 

You have to use writeStream instead of write.

 

import org.apache.spark.sql.{Dataset, Row, SaveMode}

dataDF.writeStream
  .option("checkpointLocation", "path-for-checkpoint")
  .foreachBatch { (batchDF: Dataset[Row], batchId: Long) =>
    // Write each micro-batch to Snowflake using the connector's batch writer
    batchDF.write
      .format("snowflake")
      .options(sfOptions)
      .option("dbtable", "demokinesisstream")
      .mode(SaveMode.Append)
      .save()
  }
  .start()
 
It is also recommended to use a checkpoint directory to save the streaming metadata, which lets the pipeline restart from where it previously left off.
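
For reference, both snippets assume an sfOptions map holding the Snowflake connector's connection settings. A minimal sketch is below; the option names come from the Snowflake Spark connector, while the placeholder values are hypothetical and would normally be pulled from a Databricks secret scope rather than hard-coded.

// Hypothetical connection map assumed by the snippets above.
// Replace the placeholders with your own values; prefer dbutils.secrets.get(...)
// over hard-coded credentials in a real notebook.
val sfOptions = Map(
  "sfURL"       -> "<account_identifier>.snowflakecomputing.com",
  "sfUser"      -> "<user>",
  "sfPassword"  -> "<password>",
  "sfDatabase"  -> "<database>",
  "sfSchema"    -> "<schema>",
  "sfWarehouse" -> "<warehouse>"
)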

Anonymous
Not applicable

Hi @babyhari 

Hope everything is going great.

Just wanted to check in to see if you were able to resolve your issue. If so, would you mind marking an answer as best so that other members can find the solution more quickly? If not, please let us know so we can help you further.

Cheers!
