Data Engineering

Should/Can I use Spark Streaming for batch workloads?

User16783855534
New Contributor III


2 REPLIES

User16826994223
Honored Contributor III

It's preferable to use Spark Structured Streaming (with Delta) for batch workloads rather than a regular batch job. With the Trigger.Once trigger, whenever the streaming job is started it processes whatever is available in the source (Kafka, Kinesis, or a file system) and keeps track of its progress in the streaming checkpoint location. So after the job succeeds, the next run leverages the checkpoint to know where to start from.
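For illustration, here is a minimal PySpark sketch of that pattern, assuming a JSON file source and hypothetical paths (source_path, target_path, checkpoint_path); the format, schema, and locations are placeholders to adapt to your own pipeline.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical locations -- replace with your own paths.
source_path = "/mnt/raw/events"              # files arriving from upstream
target_path = "/mnt/delta/events"            # Delta output location
checkpoint_path = "/mnt/checkpoints/events"  # streaming checkpoint

# Incrementally read whatever has landed in the source since the last run.
events = (
    spark.readStream
    .format("json")
    .schema("id STRING, ts TIMESTAMP, payload STRING")
    .load(source_path)
)

# Trigger.Once: process everything currently available, then stop.
# (On newer runtimes, .trigger(availableNow=True) is the equivalent option.)
query = (
    events.writeStream
    .format("delta")
    .option("checkpointLocation", checkpoint_path)
    .trigger(once=True)
    .start(target_path)
)

query.awaitTermination()
```

Because the checkpoint records how far the previous run got, re-running the same job simply picks up whatever arrived in the source since then.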

User16869510359
Esteemed Contributor

The streaming checkpoint mechanism is independent of the trigger type. The way checkpointing works is that it creates an offset file when it starts processing a batch and, once the batch is completed, a commit file for that batch in the checkpoint directory. Irrespective of the trigger type, whenever a new batch starts it first reconciles the offsets and commits in the checkpoint directory to identify where it has to resume.

These files are human-readable and can be seen in the checkpoint directory.
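For example, in a Databricks notebook (where dbutils and display are available) you could inspect those files roughly like this, using the hypothetical checkpoint path from the sketch above:

```python
# Minimal sketch for a Databricks notebook, assuming `dbutils` is available
# and the hypothetical checkpoint path from the earlier example.
checkpoint_path = "/mnt/checkpoints/events"

# Every micro-batch writes an offset file before it starts processing...
display(dbutils.fs.ls(f"{checkpoint_path}/offsets"))

# ...and a commit file with the same batch id once it finishes successfully.
display(dbutils.fs.ls(f"{checkpoint_path}/commits"))

# The files are plain text (JSON-like) and can be read directly.
print(dbutils.fs.head(f"{checkpoint_path}/offsets/0"))
print(dbutils.fs.head(f"{checkpoint_path}/commits/0"))
```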
