Hi, we're having an issue with how "succeeded with failures" is handled. We get emails telling us there was a failure, which is correct, but the pipeline then treats the run as a success and keeps going, whereas we'd like the whole process reported as a failure. We have a fairly nested series of workflows that looks like this:
- main workflow
  |- sub workflow
     |- sub sub workflow
        |- some failing task that isn't a leaf/terminating node
Here's the same structure again, annotated with our issues with each run's state (the annotations looked messy inline, hence the bare structure above):
- main workflow [email says "success", state in Databricks says "succeeded"; this is wrong, because sub workflow should really be classified as a failure, not a success]
  |- sub workflow [email says "success", state in Databricks says "succeeded"; this is wrong, because sub sub workflow should really be classified as a failure, not a success]
     |- sub sub workflow [email says "failure", state in Databricks says "succeeded with failures"; ideally this should just be a failure]
        |- some failing task that isn't a leaf/terminating node [marked as failure]
So in summary, it would be ideal to mark a workflow run that has "succeeded with failures" as a failure rather than a success. I haven't found many people talking about this, but I did find someone on Reddit describing the same issue.
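In case it helps anyone, here is a minimal sketch of one possible workaround, not a confirmed fix: have the parent check the child run's `result_state` (via the Jobs 2.1 API's `runs/get`, which can report `SUCCESS_WITH_FAILURES`) and treat anything other than a clean `SUCCESS` as a hard failure. The helper below is self-contained; the commented-out request and the variable names in it are illustrative assumptions, not our actual setup.

```python
# Sketch: classify a Databricks Jobs run's result_state ourselves and
# treat SUCCESS_WITH_FAILURES (or anything else non-SUCCESS) as a failure.
# Assumes the Jobs 2.1 API response shape:
#   GET /api/2.1/jobs/runs/get?run_id=... -> {"state": {"result_state": ...}, ...}

def run_failed(result_state: str) -> bool:
    """Return True unless the run finished with a clean SUCCESS."""
    return result_state != "SUCCESS"

# Hypothetical usage from a parent task (host/token/child_run_id are placeholders):
# import requests
# resp = requests.get(f"{host}/api/2.1/jobs/runs/get",
#                     headers={"Authorization": f"Bearer {token}"},
#                     params={"run_id": child_run_id})
# state = resp.json()["state"]["result_state"]
# if run_failed(state):
#     # Raising makes the parent task fail, which propagates up the nesting.
#     raise RuntimeError(f"Child run ended as {state}; failing parent run")

print(run_failed("SUCCESS"))                # False
print(run_failed("SUCCESS_WITH_FAILURES"))  # True
```

Raising in the parent means each level of the nesting sees a genuinely failed child task, so the "succeeded with failures" state never silently propagates upward as a success.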