Resolved! How to run multiple Spark streaming jobs on one job cluster
Hi,

We have a scenario where we need to deploy 15 Spark streaming applications on Databricks, each reading from Kafka, to a single job cluster. We tried the following approach:

1. Create job 1 with a new job cluster (C1)
2. Create job 2 pointing to C1
3. ... and so on, up to job 15...
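For reference, these 15 streams can also run as concurrent queries inside a single Spark application on one cluster. Below is a minimal PySpark sketch of that pattern; the broker address, topic names, and checkpoint/output paths are placeholders, not details from this thread.

```python
# Minimal sketch: 15 Kafka streams as concurrent queries in one application,
# so a single (job) cluster hosts all of them. All names/paths are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("fifteen-streams").getOrCreate()

TOPICS = [f"topic-{i}" for i in range(1, 16)]  # one hypothetical topic per app

def start_query(topic):
    """Start one Kafka-to-Delta stream; every query needs its own checkpoint."""
    return (
        spark.readStream
        .format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
        .option("subscribe", topic)
        .load()
        .selectExpr("CAST(key AS STRING) AS key", "CAST(value AS STRING) AS value")
        .writeStream
        .format("delta")                                   # or any other sink
        .option("checkpointLocation", f"/tmp/checkpoints/{topic}")
        .option("path", f"/tmp/output/{topic}")
        .start()
    )

queries = [start_query(t) for t in TOPICS]

# Block the driver until any query terminates; the others keep running.
spark.streams.awaitAnyTermination()
```

One design note: all queries share the driver and the cluster's executors, so the cluster has to be sized for the combined load, and a driver crash takes down every stream at once.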
Latest Reply
@Hubert Dudek, thanks a lot for responding. With a setup like this, if one task fails, it will not terminate the entire job, right? Since the job runs continuously as a streaming app, is it possible to add a new task to the job (while it is running)?
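On the failure question: at the Spark level, streaming queries in one application are independent, so one query stopping does not by itself stop the others (whether a failed task fails the whole multi-task job on the Databricks side depends on task dependencies and job settings). Below is a hedged sketch of a watchdog loop that restarts only the stopped query; it reuses `spark`, `TOPICS`, and the hypothetical `start_query` helper from the earlier snippet in place of `awaitAnyTermination()`.

```python
# Hedged sketch: poll each StreamingQuery and restart just the ones that stop,
# leaving healthy streams untouched. Continues the earlier placeholder setup.
import time

queries = {t: start_query(t) for t in TOPICS}

while True:
    for t, q in list(queries.items()):
        if not q.isActive:
            # exception() returns the failure cause, or None on a clean stop
            print(f"query for {t} stopped: {q.exception()}")
            queries[t] = start_query(t)  # restart only this stream
    time.sleep(30)  # polling interval; tune as needed
```

Note that this restarts a query inside the same driver process; it does not add or remove tasks in the Databricks job definition itself.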