06-28-2021 12:46 AM
Hi Team, I have a customer who is running multiple streams in a jar-based job. One of the streams got terminated, but the other streams keep processing without terminating.
Is this known behaviour for jar-based streaming applications? Any insight, please?
Labels:
- Streams
Accepted Solution
06-28-2021 12:52 AM
- Failure in any of the active streaming queries causes the active run to fail and terminate all the other streaming queries. You do not need to use
streamingQuery.awaitTermination() or
spark.streams.awaitAnyTermination()
- at the end of your notebook. Jobs automatically prevent a run from completing when a streaming query is active.
That guarantee is for notebook jobs, though. In a jar-based job it seems you do have to call one of the two functions above so that a failure in one stream fails all the streams.
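To illustrate, here is a minimal sketch of what that looks like in a jar-based driver. The object name, sources, and sinks below are placeholders for illustration only, not from this thread; the key point is the final awaitAnyTermination() call, which blocks the driver and rethrows the first query failure so the whole run fails:

```scala
import org.apache.spark.sql.SparkSession

// Hypothetical jar-based Spark Structured Streaming driver (sketch).
object MultiStreamJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("multi-stream-job")
      .getOrCreate()

    // Start two independent streaming queries.
    // The "rate" source and "console" sink are placeholders.
    val q1 = spark.readStream.format("rate").load()
      .writeStream.format("console").start()
    val q2 = spark.readStream.format("rate").load()
      .writeStream.format("console").start()

    // Without this call, main() can return while the queries keep running,
    // and one query failing will not stop the others.
    // awaitAnyTermination() blocks until any query stops, and rethrows its
    // exception if it failed, so the driver exits and the remaining
    // streams are terminated with the run.
    spark.streams.awaitAnyTermination()
  }
}
```

Calling streamingQuery.awaitTermination() on a single query behaves similarly but only watches that one query, so awaitAnyTermination() is the more natural fit when several streams run in the same job.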

