ServiceNow integration with AWS Databricks

Narendra_v
New Contributor III

Hi Databricks Community,

We’re running Databricks on AWS and would like to improve operational incident management for production workloads.

  1. Is there any official Databricks documentation or recommended approach to integrate with ServiceNow for automated incident/ticket creation (e.g., on job failure, cluster issues, etc.)?

  2. For high-priority job failures, what are the best options to configure real-time notifications to a phone (SMS/voice/push)?

    • Are there native capabilities in Databricks Workflows/Jobs for this, or is the recommended pattern to integrate with AWS services (SNS, EventBridge, PagerDuty/Opsgenie, etc.)?

Any guidance, reference architectures, or example implementations would be appreciated.

Thanks,

Narendra Vempala 

@Naren.Samurai

anshu_roy
Databricks Employee

Hello, you can integrate Databricks job failures with ServiceNow using webhook notifications from Databricks Jobs.

  • In ServiceNow, create an inbound REST or Scripted REST API that takes the JSON payload and creates an incident.

  • In Databricks, edit the job, add a notification, choose a Webhook notification destination, and point it to the ServiceNow endpoint; enable the "On failure" notification type so failed runs trigger incident creation.
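The steps above can be sketched in Python: a small handler that maps a job-failure webhook payload onto a ServiceNow incident created through the standard Table API (`POST /api/now/table/incident`). The payload field names, instance name, and credentials here are assumptions for illustration; verify them against the payload your notification destination actually sends and your instance's auth setup.

```python
import base64
import json
import urllib.request

# Hypothetical shape of a Databricks job-failure webhook payload --
# check the actual JSON your webhook destination delivers.
SAMPLE_EVENT = {
    "event_type": "jobs.on_failure",
    "job": {"job_id": 123, "name": "nightly-etl"},
    "run": {"run_id": 456, "start_time": 1735689600000},
}


def build_incident_payload(event: dict) -> dict:
    """Map a Databricks failure event onto ServiceNow incident fields."""
    job = event.get("job", {})
    run = event.get("run", {})
    return {
        "short_description": (
            f"Databricks job failed: {job.get('name')} (run {run.get('run_id')})"
        ),
        "description": json.dumps(event, indent=2),
        "urgency": "1",  # high -- tune to your incident policy
        "category": "software",
    }


def create_incident(instance: str, user: str, password: str, event: dict) -> None:
    """POST the incident to the ServiceNow Table API with basic auth."""
    body = json.dumps(build_incident_payload(event)).encode()
    auth = base64.b64encode(f"{user}:{password}".encode()).decode()
    req = urllib.request.Request(
        f"https://{instance}.service-now.com/api/now/table/incident",
        data=body,
        method="POST",
        headers={
            "Content-Type": "application/json",
            "Accept": "application/json",
            "Authorization": f"Basic {auth}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        print("created incident:", json.loads(resp.read())["result"]["number"])
```

In production you would typically let ServiceNow's Scripted REST API do this mapping server-side, with the Databricks webhook posting directly to it; the sketch above just makes the field mapping concrete.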

You can find the exact UI steps and payload details here:

Narendra_v
New Contributor III

@anshu_roy Hello, thanks for the reply.

Any sample code for the REST API (that will fetch the job log along with the job name and run time) would be helpful.

@Naren.Samurai
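A minimal sketch of the kind of sample code requested above, using the Databricks Jobs API 2.1: `GET /api/2.1/jobs/runs/get` returns a run's name and timestamps, and `GET /api/2.1/jobs/runs/get-output` returns output/error details (for single-task runs; for multi-task jobs you pass the individual task's run ID). The host and token values are placeholders.

```python
import json
import urllib.request
from datetime import datetime, timezone

HOST = "https://<your-workspace>.cloud.databricks.com"  # placeholder
TOKEN = "<personal-access-token>"                       # placeholder


def api_get(path: str, **params) -> dict:
    """GET a Databricks REST endpoint with bearer-token auth."""
    query = "&".join(f"{k}={v}" for k, v in params.items())
    req = urllib.request.Request(
        f"{HOST}{path}?{query}",
        headers={"Authorization": f"Bearer {TOKEN}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())


def summarize_run(run: dict) -> dict:
    """Pull name and timing fields out of a /jobs/runs/get response.

    Databricks reports timestamps as epoch milliseconds.
    """
    to_iso = lambda ms: datetime.fromtimestamp(
        ms / 1000, tz=timezone.utc
    ).isoformat()
    return {
        "run_name": run.get("run_name"),
        "start_time": to_iso(run["start_time"]),
        "end_time": to_iso(run["end_time"]) if run.get("end_time") else None,
        "result_state": run.get("state", {}).get("result_state"),
    }


def fetch_run_details(run_id: int) -> dict:
    """Fetch a run's summary plus any error text from its output."""
    summary = summarize_run(api_get("/api/2.1/jobs/runs/get", run_id=run_id))
    # runs/get-output only works for single-task runs / individual task runs
    output = api_get("/api/2.1/jobs/runs/get-output", run_id=run_id)
    summary["error"] = output.get("error")
    return summary
```

The same summary dict could feed the ServiceNow incident description directly, tying the two halves of this thread together.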