Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

If multiple streams are running in a JAR-based job and one fails, will the others fail too?

User16826994223
Honored Contributor III

Hi Team, I have a customer who is running multiple streams in a JAR-based job. One of the streams got terminated, but the other streams keep processing without terminating.

Is this known behaviour for JAR-based streaming applications? Any insight, please?

1 ACCEPTED SOLUTION

User16826994223
Honored Contributor III
  • Failure in any of the active streaming queries causes the active run to fail and terminate all the other streaming queries. You do not need to use streamingQuery.awaitTermination() or spark.streams.awaitAnyTermination() at the end of your notebook; jobs automatically prevent a run from completing while a streaming query is active.

But it seems that in a JAR-based application you do have to call one of the above two functions for a failure in one stream to fail the others.
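For a JAR-based job that typically means blocking the driver's main method on the running queries. Below is a minimal Scala sketch, not taken from the original post: the two rate/console streams are placeholder stand-ins for real sources and sinks, and the relevant part is the spark.streams.awaitAnyTermination() call, which rethrows the failing query's exception so the whole run fails.

import org.apache.spark.sql.SparkSession

object MultiStreamJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("multi-stream-job")
      .getOrCreate()

    // Placeholder streams; replace with your actual sources and sinks.
    val query1 = spark.readStream
      .format("rate")
      .load()
      .writeStream
      .format("console")
      .start()

    val query2 = spark.readStream
      .format("rate")
      .load()
      .writeStream
      .format("console")
      .start()

    // Without this call, main would return while the queries run in the
    // background, and an exception in one query would not stop the job.
    // awaitAnyTermination() blocks until any query terminates (normally or
    // with an error) and rethrows that query's exception, so the driver
    // exits and the run fails, terminating the remaining streams.
    // Note: it also returns when a query finishes successfully, so if your
    // queries can complete normally you may want to loop and re-await.
    spark.streams.awaitAnyTermination()
  }
}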

