<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: How do you manage alerts? in Administration &amp; Architecture</title>
    <link>https://community.databricks.com/t5/administration-architecture/how-do-you-manage-alerts/m-p/142007#M4649</link>
    <description>&lt;P&gt;Hi &lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/201015"&gt;@smirnoal&lt;/a&gt;,&lt;/P&gt;&lt;P&gt;If you want more dynamic behaviour, you can use Python for Databricks Asset Bundles, which extends &lt;A href="https://docs.databricks.com/aws/en/dev-tools/bundles/" target="_blank"&gt;Databricks Asset Bundles&lt;/A&gt; so that you can:&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;Define resources in Python code. These definitions can coexist with &lt;A href="https://docs.databricks.com/aws/en/dev-tools/bundles/resources#resource-types" target="_blank"&gt;resources defined in YAML&lt;/A&gt;.&lt;/LI&gt;&lt;LI&gt;&lt;STRONG&gt;Dynamically create resources using metadata&lt;/STRONG&gt;. See &lt;A href="https://docs.databricks.com/aws/en/dev-tools/bundles/python/#metadata" target="_blank"&gt;Create resources using metadata&lt;/A&gt;.&lt;/LI&gt;&lt;LI&gt;Modify resources defined in YAML or Python during bundle deployment. See &lt;A href="https://docs.databricks.com/aws/en/dev-tools/bundles/python/#modify" target="_blank"&gt;Modify resources defined in YAML or Python&lt;/A&gt;.&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;Regarding the following question: it should work exactly like that. Under the hood, DAB uses Terraform, so whenever you deploy your changes, Terraform compares its state file with the changes introduced by your code. If it detects that an alert was removed from your code, it will remove that alert from your environment.&lt;/P&gt;&lt;P&gt;"&lt;SPAN&gt;I also would like to understand if I can create/update/delete alerts without interfering with other teams, as they run their workflows and alerts in the same workspace. If I remove one of my alerts from the repository, will the DABs deploy command detect this and safely remove the alert from the Databricks workspace? I'm aiming for the "single source of truth" model, where what I have in the repo would be reflected in Databricks."&lt;/SPAN&gt;&lt;/P&gt;</description>
    <pubDate>Tue, 16 Dec 2025 16:50:33 GMT</pubDate>
    <dc:creator>szymon_dybczak</dc:creator>
    <dc:date>2025-12-16T16:50:33Z</dc:date>
    <item>
      <title>How do you manage alerts?</title>
      <link>https://community.databricks.com/t5/administration-architecture/how-do-you-manage-alerts/m-p/141995#M4648</link>
      <description>&lt;P&gt;Hey all,&lt;/P&gt;&lt;P&gt;I'm curious how teams manage Databricks alerts.&lt;/P&gt;&lt;P&gt;My use case: I have around 10 Spark workflows and need to validate their output tables.&lt;/P&gt;&lt;P&gt;My first iteration was to create alerts manually, e.g. define the SQL, evaluation criteria, notification email, schedule, etc. If anything had to be changed, I would go and modify the alert manually.&lt;/P&gt;&lt;P&gt;As you might have imagined, this approach doesn't scale well. With more people on the team and more workflows, alert management became a bit chaotic.&lt;/P&gt;&lt;P&gt;I am looking at the DABs approach to codify alerts and deploy them through CI/CD, but it lacks ergonomics in my opinion. The alert definition has the notification text and SQL embedded, which makes it hard to change; the JSON is hardly readable.&lt;/P&gt;&lt;P&gt;I also would like to understand if I can create/update/delete alerts without interfering with other teams, as they run their workflows and alerts in the same workspace. If I remove one of my alerts from the repository, will the DABs deploy command detect this and safely remove the alert from the Databricks workspace? I'm aiming for the "single source of truth" model, where what I have in the repo would be reflected in Databricks.&lt;/P&gt;&lt;P&gt;I would also like to avoid hardcoding the warehouse ID in the alert definitions. It would be great to select it either by tag or by size.&lt;/P&gt;&lt;P&gt;Could you please share your experience managing alerts in your team?&lt;/P&gt;</description>
      <pubDate>Tue, 16 Dec 2025 15:35:47 GMT</pubDate>
      <guid>https://community.databricks.com/t5/administration-architecture/how-do-you-manage-alerts/m-p/141995#M4648</guid>
      <dc:creator>smirnoal</dc:creator>
      <dc:date>2025-12-16T15:35:47Z</dc:date>
    </item>
    <item>
      <title>Re: How do you manage alerts?</title>
      <link>https://community.databricks.com/t5/administration-architecture/how-do-you-manage-alerts/m-p/142007#M4649</link>
      <description>&lt;P&gt;Hi &lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/201015"&gt;@smirnoal&lt;/a&gt;,&lt;/P&gt;&lt;P&gt;If you want more dynamic behaviour, you can use Python for Databricks Asset Bundles, which extends &lt;A href="https://docs.databricks.com/aws/en/dev-tools/bundles/" target="_blank"&gt;Databricks Asset Bundles&lt;/A&gt; so that you can:&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;Define resources in Python code. These definitions can coexist with &lt;A href="https://docs.databricks.com/aws/en/dev-tools/bundles/resources#resource-types" target="_blank"&gt;resources defined in YAML&lt;/A&gt;.&lt;/LI&gt;&lt;LI&gt;&lt;STRONG&gt;Dynamically create resources using metadata&lt;/STRONG&gt;. See &lt;A href="https://docs.databricks.com/aws/en/dev-tools/bundles/python/#metadata" target="_blank"&gt;Create resources using metadata&lt;/A&gt;.&lt;/LI&gt;&lt;LI&gt;Modify resources defined in YAML or Python during bundle deployment. See &lt;A href="https://docs.databricks.com/aws/en/dev-tools/bundles/python/#modify" target="_blank"&gt;Modify resources defined in YAML or Python&lt;/A&gt;.&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;Regarding the following question: it should work exactly like that. Under the hood, DAB uses Terraform, so whenever you deploy your changes, Terraform compares its state file with the changes introduced by your code. If it detects that an alert was removed from your code, it will remove that alert from your environment.&lt;/P&gt;&lt;P&gt;"&lt;SPAN&gt;I also would like to understand if I can create/update/delete alerts without interfering with other teams, as they run their workflows and alerts in the same workspace. If I remove one of my alerts from the repository, will the DABs deploy command detect this and safely remove the alert from the Databricks workspace? I'm aiming for the "single source of truth" model, where what I have in the repo would be reflected in Databricks."&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 16 Dec 2025 16:50:33 GMT</pubDate>
      <guid>https://community.databricks.com/t5/administration-architecture/how-do-you-manage-alerts/m-p/142007#M4649</guid>
      <dc:creator>szymon_dybczak</dc:creator>
      <dc:date>2025-12-16T16:50:33Z</dc:date>
    </item>
    <item>
      <title>Re: How do you manage alerts?</title>
      <link>https://community.databricks.com/t5/administration-architecture/how-do-you-manage-alerts/m-p/142012#M4650</link>
      <description>&lt;P&gt;&lt;SPAN&gt;Thanks for your reply!&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;I want to understand this part better:&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;&amp;gt; If it detects that an alert was removed from your code, it will remove that alert from your environment.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;Alerts are defined in the same repositories as the Spark workflows.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;Let's consider this example:&lt;/P&gt;&lt;P&gt;WorkflowA: alert1, alert2, alert3&lt;/P&gt;&lt;P&gt;WorkflowB: alert4, alert5, alert6&lt;BR /&gt;&lt;BR /&gt;So if I want to remove alert1, my desired state is: alert2, alert3, alert4, alert5, alert6.&lt;/P&gt;&lt;P&gt;When I deploy from WorkflowA's repository, it doesn't know anything about WorkflowB's alerts (alert4, alert5, alert6), right? So would it remove the unknowns and leave only alert2 and alert3?&lt;/P&gt;&lt;P&gt;OR&lt;/P&gt;&lt;P&gt;Should I host all alerts in a single repository where the alerts for all workflows are defined?&lt;/P&gt;</description>
      <pubDate>Tue, 16 Dec 2025 17:21:17 GMT</pubDate>
      <guid>https://community.databricks.com/t5/administration-architecture/how-do-you-manage-alerts/m-p/142012#M4650</guid>
      <dc:creator>smirnoal</dc:creator>
      <dc:date>2025-12-16T17:21:17Z</dc:date>
    </item>
  </channel>
</rss>

