02-10-2023 09:43 AM
I have a continuously running streaming job that I would like to stop over the weekend and start again on Monday. Here is my streaming job code:
(spark.readStream.format("delta").load(input_path)
.writeStream
.option("checkpointLocation", input_checkpoint_path)
.trigger(processingTime="1 minute")
.foreachBatch(foreachBatchFunction)
.start()
)
Regards
Sanjay
02-10-2023 11:17 AM
Hi @Sanjay Jain, you can stop your job from the UI.
02-11-2023 02:07 AM
Thank you Lakshay. I am looking to stop it automatically, through some regex or programmatically.
It is hard to remember to stop the job every Friday and start it again on Monday, and the chances of forgetting are high.
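One way to stop a query programmatically (a minimal sketch, not from the thread: `should_stream_run` is a hypothetical helper, and the polling loop assumes the PySpark `StreamingQuery.awaitTermination(timeout)` / `stop()` API) is to gate the stream on the day of the week:

```python
import datetime

# Hypothetical helper (an assumption, not from the thread): decide whether
# the stream should be running based on the day of the week.
def should_stream_run(now=None):
    """Return False on Saturday and Sunday, True otherwise."""
    now = now or datetime.datetime.now()
    # Monday is 0 ... Friday is 4, Saturday is 5, Sunday is 6
    return now.weekday() < 5

# In the Databricks notebook/job, the handle returned by .start() could then
# be polled and stopped, e.g. (sketch only, assuming the original pipeline):
#
#   query = (spark.readStream.format("delta").load(input_path)
#            .writeStream
#            .option("checkpointLocation", input_checkpoint_path)
#            .trigger(processingTime="1 minute")
#            .foreachBatch(foreachBatchFunction)
#            .start())
#   while should_stream_run():
#       query.awaitTermination(60)   # wait up to 60 s, then re-check the day
#   query.stop()                     # checkpoint lets Monday's run resume
```

Because the stream writes to a checkpoint location, stopping the query and starting it again on Monday (for example from a scheduled weekday job) resumes from where it left off.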
02-10-2023 08:42 PM
02-11-2023 02:07 AM
Thank you Mathew. I will check the Continuous job feature.
02-12-2023 10:48 PM
Hi @Sanjay Jain
Hope all is well! Just wanted to check in to see whether you were able to resolve your issue. If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help.
We'd love to hear from you.
Thanks!
02-14-2023 04:09 AM
Hi Vidula,
I could not find the continuous option suggested by Mathew. Can you help me with how/where to find this option?
Regards,
Sanjay
02-16-2023 04:28 AM
@Vidula Khanna I too couldn't find the continuous option or trigger option. @Mohan Mathews, can you help us find it?
08-23-2023 10:41 AM
@sanjay Any luck with that? I am also looking for a solution to the same issue.