Data Engineering
Databricks streaming dataframe into Snowflake

babyhari
New Contributor II

Any suggestions on how to stream data from Databricks into Snowflake? Is Snowpipe the only option? Snowpipe is not fast enough for us, since it runs COPY INTO at small batch intervals rather than within a few seconds. If there is no option other than Snowpipe, how can I call it from a Databricks notebook? Thanks in advance for your help.

 

The snippet below did not work, because dataDF is a streaming DataFrame:

 

import org.apache.spark.sql.SaveMode
import net.snowflake.spark.snowflake.Utils

dataDF.write
  .format("snowflake")
  .options(sfOptions)
  .option("dbtable", "demokinesisstream")
  .mode(SaveMode.Append)
  .save()
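On the second part of the question (calling COPY INTO from a notebook): the Snowflake Spark connector's Utils object can run arbitrary SQL against Snowflake, which includes a COPY INTO statement. A minimal sketch, assuming sfOptions already holds valid connection settings and that the stage and table names are placeholders:

```scala
import net.snowflake.spark.snowflake.Utils

// Runs a SQL statement on Snowflake using the same connection options
// as the DataFrame writer. The stage/table names here are hypothetical.
Utils.runQuery(sfOptions,
  "COPY INTO demokinesisstream FROM @my_stage FILE_FORMAT = (TYPE = JSON)")
```

This only issues the statement; it does not make Snowpipe itself any faster, so for low-latency loads the streaming writer in the replies below is the more direct route.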

Tharun-Kumar
Honored Contributor II

@babyhari 

You have to use writeStream instead of write.

 

import org.apache.spark.sql.{Dataset, Row, SaveMode}

dataDF.writeStream
  .option("checkpointLocation", "path-for-checkpoint")
  .foreachBatch { (batchDF: Dataset[Row], batchId: Long) =>
    batchDF.write
      .format("snowflake")
      .options(sfOptions)
      .option("dbtable", "demokinesisstream")
      .mode(SaveMode.Append)
      .save()
  }
  .start()
 
It is also recommended to use a checkpoint directory to persist the streaming metadata, which lets the pipeline restart from where it left off after a failure.
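Since the original concern was latency, it is worth noting that the micro-batch cadence of foreachBatch is controlled by the query trigger. A sketch of the same writer with an explicit processing-time trigger; the 5-second interval is an illustrative value, not a recommendation:

```scala
import org.apache.spark.sql.streaming.Trigger
import org.apache.spark.sql.{Dataset, Row, SaveMode}

dataDF.writeStream
  .option("checkpointLocation", "path-for-checkpoint")
  // Fire a micro-batch roughly every 5 seconds (hypothetical interval);
  // each batch is appended to Snowflake via the connector's batch writer.
  .trigger(Trigger.ProcessingTime("5 seconds"))
  .foreachBatch { (batchDF: Dataset[Row], batchId: Long) =>
    batchDF.write
      .format("snowflake")
      .options(sfOptions)
      .option("dbtable", "demokinesisstream")
      .mode(SaveMode.Append)
      .save()
  }
  .start()
```

Shorter intervals reduce end-to-end latency but increase the number of small COPY operations on the Snowflake side, so the interval is a throughput/latency trade-off.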

Anonymous
Not applicable

Hi @babyhari 

Hope everything is going great.

Just wanted to check in on whether you were able to resolve your issue. If so, would you be happy to mark an answer as best so that other members can find the solution more quickly? If not, please let us know so we can help.

Cheers!
