02-10-2023 09:43 AM
I have a continuously running streaming job. I would like to stop it over the weekend and start it again on Monday. Here is my streaming job code:
(spark.readStream.format("delta").load(input_path)
.writeStream
.option("checkpointLocation", input_checkpoint_path)
.trigger(processingTime="1 minute")
.foreachBatch(foreachBatchFunction)
.start()
)
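One way to stop a stream like this programmatically (a sketch, not something stated in the thread) is to keep the `StreamingQuery` handle returned by `start()`, block with `awaitTermination(timeout)` until a cutoff time, and then call `query.stop()`; the checkpoint lets Monday's run resume where it left off. The `seconds_until` helper and the Friday 18:00 cutoff below are illustrative assumptions:

```python
from datetime import datetime, timedelta

def seconds_until(target_weekday: int, target_hour: int, now: datetime) -> float:
    """Seconds from `now` until the next occurrence of target_weekday
    (Mon=0 .. Sun=6) at target_hour:00. Rolls to next week if that
    moment is now or already past."""
    days_ahead = (target_weekday - now.weekday()) % 7
    candidate = (now + timedelta(days=days_ahead)).replace(
        hour=target_hour, minute=0, second=0, microsecond=0)
    if candidate <= now:
        candidate += timedelta(days=7)
    return (candidate - now).total_seconds()

# Hypothetical usage with the stream from the post (requires a SparkSession):
# query = (spark.readStream.format("delta").load(input_path)
#          .writeStream
#          .option("checkpointLocation", input_checkpoint_path)
#          .trigger(processingTime="1 minute")
#          .foreachBatch(foreachBatchFunction)
#          .start())
# query.awaitTermination(timeout=seconds_until(4, 18, datetime.now()))  # until Friday 18:00
# query.stop()  # graceful stop; the checkpoint allows resuming on Monday
```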
Regards
Sanjay
02-10-2023 11:17 AM
Hi @Sanjay Jain, you can stop your job from the UI.
02-11-2023 02:07 AM
Thank you, Lakshay. I am looking to stop it automatically, through some schedule or programmatically.
It's hard to remember, and there is a chance of forgetting to stop it every Friday and start it again on Monday.
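One common pattern for this (a sketch under assumptions, not confirmed in this thread): replace the continuous `processingTime` trigger with `trigger(availableNow=True)` and run the notebook from a scheduled job whose schedule skips weekends. Each run drains the available data and exits on its own, so there is nothing to stop manually. A weekday guard can also be added in the code itself; `is_weekend` below is a hypothetical helper:

```python
from datetime import datetime

def is_weekend(now: datetime) -> bool:
    """Return True on Saturday (weekday 5) or Sunday (weekday 6)."""
    return now.weekday() >= 5

# Hypothetical entry point for a scheduled (non-continuous) job:
# if not is_weekend(datetime.now()):
#     (spark.readStream.format("delta").load(input_path)
#      .writeStream
#      .option("checkpointLocation", input_checkpoint_path)
#      .trigger(availableNow=True)   # process the backlog, then stop on its own
#      .foreachBatch(foreachBatchFunction)
#      .start()
#      .awaitTermination())
```

`availableNow` requires a reasonably recent Spark version (3.3+); on older runtimes, `trigger(once=True)` behaves similarly for this purpose.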
02-11-2023 02:07 AM
Thank you, Mathew. I will check the continuous job feature.
02-12-2023 10:48 PM
Hi @Sanjay Jain
Hope all is well! Just wanted to check in to see if you were able to resolve your issue. If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help.
We'd love to hear from you.
Thanks!
02-14-2023 04:09 AM
Hi Vidula,
I could not find the continuous option suggested by Mathew. Can you help with how/where to find this option?
Regards,
Sanjay
02-16-2023 04:28 AM
@Vidula Khanna I too couldn't find the continuous option or trigger option. @Mohan Mathews, could you help us find the option?
08-23-2023 10:41 AM
@sanjay Any luck with that? I am also looking for a solution to the same issue.