ServiceNow integration with AWS Databricks
02-07-2026 11:49 AM
Hi Databricks Community,
We’re running Databricks on AWS and would like to improve operational incident management for production workloads.
Is there any official Databricks documentation or recommended approach to integrate with ServiceNow for automated incident/ticket creation (e.g., on job failure, cluster issues, etc.)?
For high-priority job failures, what are the best options to configure real-time notifications to a phone (SMS/voice/push)?
- Are there native capabilities in Databricks Workflows/Jobs for this, or is the recommended pattern to integrate with AWS services (SNS, EventBridge, PagerDuty/Opsgenie, etc.)?
Any guidance, reference architectures, or example implementations would be appreciated.
Thanks,
Narendra Vempala
- Labels: Workflows
02-13-2026 03:11 AM
Hello, you can integrate Databricks job failures with ServiceNow using webhook notifications from Jobs.
- In ServiceNow, create an inbound REST or Scripted REST API that takes the JSON payload and creates an incident.
- In Databricks, edit the job, add a notification, choose a webhook system destination, and point it to the ServiceNow endpoint; enable the "on failure" notification type so failed runs trigger incident creation.
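To make the mapping concrete, here is a minimal Python sketch of what the ServiceNow side (a Scripted REST API, or a small relay service in front of it) could do with the job-failure webhook payload. The payload field names (`job`, `run`, `run_page_url`, etc.) are illustrative assumptions, not the exact schema — check the payload documented under the job notifications link below before relying on them.

```python
# Sketch: map a (hypothetical) Databricks job-failure webhook payload to a
# ServiceNow incident record suitable for POST /api/now/table/incident.
# Field names in the incoming payload are illustrative assumptions.

def build_incident(payload: dict) -> dict:
    """Build a ServiceNow incident body from a failure notification."""
    job = payload.get("job", {})
    run = payload.get("run", {})
    return {
        "short_description": f"Databricks job failed: {job.get('name', 'unknown')}",
        "description": (
            f"Run ID: {run.get('run_id')}\n"
            f"State: {run.get('result_state')}\n"
            f"Run page: {run.get('run_page_url')}"
        ),
        "urgency": "1",        # high urgency -- tune to your incident policy
        "category": "software",
    }

# Example payload shaped like a job-failure notification (illustrative):
example = {
    "job": {"name": "nightly-etl"},
    "run": {
        "run_id": 12345,
        "result_state": "FAILED",
        "run_page_url": "https://<workspace-url>/#job/1/run/12345",
    },
}
incident = build_incident(example)
```

From there, the Scripted REST API (or relay) would insert `incident` into the `incident` table via ServiceNow's Table API.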
You can find the exact UI steps and payload details here:
- Job notifications: https://docs.databricks.com/aws/en/jobs/notifications
- Webhook destinations: https://docs.databricks.com/aws/en/admin/workspace-settings/notification-destinations
02-15-2026 04:39 AM
@anshu_roy Hello, thanks for the reply.
Any sample code for a REST API call that fetches the job log along with the job name and run time would be helpful.
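For fetching a run's name and timing, a minimal sketch along these lines could work, using the Jobs API 2.1 `runs/get` endpoint. The `DATABRICKS_HOST` and `DATABRICKS_TOKEN` environment variable names are assumptions for this sketch; verify the response fields against your workspace's API version.

```python
# Sketch: fetch one job run via GET /api/2.1/jobs/runs/get and summarize
# its name, start time, and duration. Requires a workspace URL and a
# personal access token (env var names here are assumptions).
import os
import json
import datetime
import urllib.request

def summarize_run(run: dict) -> dict:
    """Pull the run name, start time, duration, and state out of a
    runs/get-shaped response (timestamps are epoch milliseconds)."""
    start_ms = run.get("start_time", 0)
    end_ms = run.get("end_time", 0)
    started = datetime.datetime.fromtimestamp(
        start_ms / 1000, tz=datetime.timezone.utc)
    return {
        "run_name": run.get("run_name"),
        "started": started.isoformat(),
        "duration_s": (end_ms - start_ms) / 1000 if end_ms else None,
        "result_state": run.get("state", {}).get("result_state"),
    }

def fetch_run(run_id: int) -> dict:
    """Call the Jobs API for one run (needs network access and a token)."""
    url = f"{os.environ['DATABRICKS_HOST']}/api/2.1/jobs/runs/get?run_id={run_id}"
    req = urllib.request.Request(
        url,
        headers={"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

# Offline example using a response-shaped dict (no network needed):
sample = {
    "run_name": "nightly-etl",
    "start_time": 1700000000000,
    "end_time": 1700000300000,
    "state": {"result_state": "FAILED"},
}
summary = summarize_run(sample)
```

In a real setup you would call `fetch_run(run_id)` with the run ID taken from the webhook payload and feed `summarize_run`'s output into the incident description.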