I'm trying to build a system where I let Spark finish the current micro-batch and signal it to stop afterwards.
The reason is that I don't want to re-calculate a micro-batch by "forcefully" stopping the stream.
Is there something Spark/Databricks already implements for this?
My current approach is to raise an exception at the start of the micro-batch if a graceful-stop indication was given.
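To make the question concrete, here is a minimal sketch of that approach, assuming a `foreachBatch` sink and a hypothetical marker file (`STOP_MARKER`) as the stop indication. The check runs before any work is done, so the failing batch writes nothing and the checkpoint still points at the last fully completed batch:

```python
import os

# Hypothetical sentinel: some external process creates this file
# to request a graceful stop. Any shared flag (DB row, ZK node,
# etc.) would work the same way.
STOP_MARKER = "/tmp/stop_stream_marker"

class GracefulStopRequested(Exception):
    """Raised before any work happens in the next micro-batch."""

def stop_requested() -> bool:
    # Cheap check performed once per micro-batch, on the driver.
    return os.path.exists(STOP_MARKER)

def process_batch(batch_df, epoch_id):
    # Fail fast at the top of the batch: nothing has been written
    # yet, so no partial results exist and the previous batch's
    # checkpoint/commit is the last recorded progress.
    if stop_requested():
        raise GracefulStopRequested(
            f"stop requested before epoch {epoch_id}"
        )
    # ... normal batch processing would go here,
    # e.g. batch_df.write.mode("append").saveAsTable(...)

# Wiring it into a streaming query (names illustrative):
#   query = (df.writeStream
#              .foreachBatch(process_batch)
#              .option("checkpointLocation", "/tmp/ckpt")
#              .start())
```

Since `process_batch` raises before touching the sink, restarting the query later just re-runs from the last committed offsets rather than redoing a half-finished batch.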
Thanks