<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic should have the option to mark succeeded with failures as a failure rather than a success in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/should-have-the-option-to-mark-succeeded-with-failures-as-a/m-p/126710#M47745</link>
    <description>&lt;P&gt;Hi, we are having an issue with the way "succeeded with failures" is handled. We get emails telling us that we have a failure, which is correct, but the pipeline then treats it like a success and keeps going, when we would actually like the whole process to be reported as a failure. We have a fairly nested series of workflows that looks like this:&amp;nbsp;&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;- main workflow
  |- sub workflow
      |- sub sub workflow
          |- some failing task that isn't a leaf/terminating node&lt;/LI-CODE&gt;&lt;P&gt;Now here's the same structure above but with notes about our issues with the state (this looked messy hence putting just the structure above):&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;- main workflow [email says "success", state in databricks says "succeeded", but this is wrong because sub workflow should really be classified as a failure, not a success]
  |- sub workflow [email says "success", state in databricks says "succeeded", but this is wrong because sub sub workflow should really be classified as a failure, not a success]
      |- sub sub workflow [email sent says "failure", state in databricks says "succeeded with failures", ideally this should probably just be a failure]
           |- some failing task that isn't a leaf/terminating node [marked as failure]&lt;/LI-CODE&gt;&lt;P&gt;So in summary, it would be ideal to mark a workflow run that has "succeeded with failures" as a failure rather than a success. I haven't found many people talking about this, but I did &lt;A href="https://www.reddit.com/r/databricks/comments/1bzouz8/addressing_pipeline_error_handling_in_databricks/" target="_self"&gt;find someone on reddit describing the same issue&lt;/A&gt;.&lt;/P&gt;</description>
    <pubDate>Mon, 28 Jul 2025 14:00:52 GMT</pubDate>
    <dc:creator>kenmyers-8451</dc:creator>
    <dc:date>2025-07-28T14:00:52Z</dc:date>
    <item>
      <title>should have the option to mark succeeded with failures as a failure rather than a success</title>
      <link>https://community.databricks.com/t5/data-engineering/should-have-the-option-to-mark-succeeded-with-failures-as-a/m-p/126710#M47745</link>
      <description>&lt;P&gt;Hi, we are having an issue with the way "succeeded with failures" is handled. We get emails telling us that we have a failure, which is correct, but the pipeline then treats it like a success and keeps going, when we would actually like the whole process to be reported as a failure. We have a fairly nested series of workflows that looks like this:&amp;nbsp;&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;- main workflow
  |- sub workflow
      |- sub sub workflow
          |- some failing task that isn't a leaf/terminating node&lt;/LI-CODE&gt;&lt;P&gt;Now here's the same structure above but with notes about our issues with the state (this looked messy hence putting just the structure above):&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;- main workflow [email says "success", state in databricks says "succeeded", but this is wrong because sub workflow should really be classified as a failure, not a success]
  |- sub workflow [email says "success", state in databricks says "succeeded", but this is wrong because sub sub workflow should really be classified as a failure, not a success]
      |- sub sub workflow [email sent says "failure", state in databricks says "succeeded with failures", ideally this should probably just be a failure]
           |- some failing task that isn't a leaf/terminating node [marked as failure]&lt;/LI-CODE&gt;&lt;P&gt;So in summary, it would be ideal to mark a workflow run that has "succeeded with failures" as a failure rather than a success. I haven't found many people talking about this, but I did &lt;A href="https://www.reddit.com/r/databricks/comments/1bzouz8/addressing_pipeline_error_handling_in_databricks/" target="_self"&gt;find someone on reddit describing the same issue&lt;/A&gt;.&lt;/P&gt;</description>
      <pubDate>Mon, 28 Jul 2025 14:00:52 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/should-have-the-option-to-mark-succeeded-with-failures-as-a/m-p/126710#M47745</guid>
      <dc:creator>kenmyers-8451</dc:creator>
      <dc:date>2025-07-28T14:00:52Z</dc:date>
    </item>
    <item>
      <title>Re: should have the option to mark succeeded with failures as a failure rather than a success</title>
      <link>https://community.databricks.com/t5/data-engineering/should-have-the-option-to-mark-succeeded-with-failures-as-a/m-p/126811#M47775</link>
      <description>&lt;P&gt;Hello&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/147746"&gt;@kenmyers-8451&lt;/a&gt;!&lt;/P&gt;
&lt;P&gt;This is valid product feedback that would be worth flagging.&lt;BR /&gt;In the meantime, to ensure your pipeline run fails when any sub-task fails, you can add a final sentinel task at the end of your workflow. This task programmatically inspects the state of all prior tasks; if it detects any task that failed or was only partially successful, it raises an error, causing the overall workflow to fail and trigger accurate notifications.&lt;/P&gt;</description>
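A minimal sketch of the sentinel-task logic described above, assuming the final task can obtain the result states of its upstream tasks (for instance via the Jobs API in the databricks-sdk, w.jobs.get_run); the helper function and the states dict are hypothetical, and only the check itself is shown so the logic stays self-contained:

```python
# Sketch of a sentinel task that turns any partial failure into a hard
# failure. In a real job the states dict would be built from the Jobs
# API (e.g. databricks-sdk, w.jobs.get_run(run_id).tasks); here it is
# passed in directly so the logic is self-contained.

def check_upstream_states(states):
    """Raise RuntimeError unless every upstream task ended in SUCCESS."""
    bad = sorted(key for key, state in states.items() if state != "SUCCESS")
    if bad:
        # Raising here fails the sentinel task, which in turn marks the
        # whole workflow run FAILED instead of "Succeeded with failures".
        raise RuntimeError(f"Upstream tasks did not fully succeed: {bad}")
    return "OK"
```

Run as the last task, with every other task as a dependency, the raised error makes the run itself report FAILED, which is the state a parent workflow will then see.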
      <pubDate>Tue, 29 Jul 2025 14:50:29 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/should-have-the-option-to-mark-succeeded-with-failures-as-a/m-p/126811#M47775</guid>
      <dc:creator>Advika</dc:creator>
      <dc:date>2025-07-29T14:50:29Z</dc:date>
    </item>
    <item>
      <title>Re: should have the option to mark succeeded with failures as a failure rather than a success</title>
      <link>https://community.databricks.com/t5/data-engineering/should-have-the-option-to-mark-succeeded-with-failures-as-a/m-p/127463#M47976</link>
      <description>&lt;P&gt;Thanks&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/152834"&gt;@Advika&lt;/a&gt;,&amp;nbsp;we'll give that a shot for now.&lt;/P&gt;</description>
      <pubDate>Tue, 05 Aug 2025 13:31:29 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/should-have-the-option-to-mark-succeeded-with-failures-as-a/m-p/127463#M47976</guid>
      <dc:creator>kenmyers-8451</dc:creator>
      <dc:date>2025-08-05T13:31:29Z</dc:date>
    </item>
    <item>
      <title>Re: should have the option to mark succeeded with failures as a failure rather than a success</title>
      <link>https://community.databricks.com/t5/data-engineering/should-have-the-option-to-mark-succeeded-with-failures-as-a/m-p/151646#M53673</link>
      <description>&lt;P&gt;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/152834"&gt;@Advika&lt;/a&gt;&amp;nbsp;Currently, if we set run_if: ALL_DONE at the task level, the run is flagged as "succeeded with failures". But this workflow runs under another workflow, and in the master workflow it is marked as succeeded only, not "succeeded with failures".&lt;/P&gt;&lt;P&gt;My orchestration is set up as below:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;master workflow:
   sub workflow:
      task 1:
      task 2:&lt;/LI-CODE&gt;&lt;P&gt;Task 1 failed and the sub workflow is marked as "succeeded with failures", but when I looked at the logs of the master workflow, it shows the sub workflow as succeeded. Do you have any idea when this can be fixed?&lt;/P&gt;</description>
      <pubDate>Sun, 22 Mar 2026 13:48:23 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/should-have-the-option-to-mark-succeeded-with-failures-as-a/m-p/151646#M53673</guid>
      <dc:creator>Anish_2</dc:creator>
      <dc:date>2026-03-22T13:48:23Z</dc:date>
    </item>
  </channel>
</rss>

