
Azure Databricks Status

noorbasha534
Contributor III

Dear all,

I wanted to check if anyone has implemented a solution that captures information from the Databricks status page in real time, 24x7, and loads it into a log or table...

https://learn.microsoft.com/en-us/azure/databricks/resources/status

What is the best way to be informed about Databricks availability? Is creating a webhook and getting notified enough?

My requirement is also to report on service downtimes and derive several KPIs from them...

Appreciate the mindshare..

 

3 REPLIES

TheRealOliver
Contributor

It seems that the webhook is the way!

There is nothing about system status in the Databricks REST API.

There is nothing about system status in the System Tables schema.

@TheRealOliver thanks. Yes, but have you come across someone doing this, I mean listening to the status 365 days a year, 24x7? With the current mail subscription we have alert fatigue, as our mailbox is also leveraged for other workloads 🙂

I agree that status updates must be useful and actionable.

If I understood your question correctly, you want the Databricks service notifications to be available in a Databricks table. I wouldn't use an email subscription for this; I would use the webhook approach.

I am not proficient with Azure, though, so I can only describe the design in general terms:

  • Deploy a serverless HTTP endpoint (for example with Azure Logic Apps or Azure Functions). The purpose of the endpoint is to save incoming requests to blob storage (a rough sketch of such a function follows this list).
  • Submit the endpoint URL to the status page and configure which notifications you want to receive.
  • Set up incremental ingestion from blob storage in Databricks and trigger it on a schedule at the desired frequency, collecting the notifications from blob storage and saving them to a Delta table (see the second sketch below).
  • Set up downstream processes that trigger other actions in response to specific notifications.
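Here is a minimal sketch of what the HTTP endpoint could look like, assuming an Azure Function on the Python v2 programming model. The route, container name, and connection string are placeholders I made up, not anything dictated by the status page:

```python
import datetime
import uuid

import azure.functions as func
from azure.storage.blob import BlobServiceClient

app = func.FunctionApp()

@app.route(route="status-webhook", auth_level=func.AuthLevel.FUNCTION)
def status_webhook(req: func.HttpRequest) -> func.HttpResponse:
    # Raw JSON body sent by the status page notification
    payload = req.get_body()

    # One blob per notification, partitioned by date so downstream ingestion stays incremental
    blob_name = (
        f"status-notifications/{datetime.date.today().isoformat()}/{uuid.uuid4()}.json"
    )

    # "<storage-connection-string>" and the "status-events" container are placeholders
    service = BlobServiceClient.from_connection_string("<storage-connection-string>")
    service.get_blob_client(container="status-events", blob=blob_name).upload_blob(payload)

    return func.HttpResponse("accepted", status_code=200)
```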
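And a rough sketch of the incremental ingestion step, assuming Auto Loader running as a scheduled Databricks job (`spark` is the session already available in a notebook or job; the paths, schema/checkpoint locations, and target table name are placeholders):

```python
from pyspark.sql import functions as F

# Landing location written by the webhook endpoint (placeholder path)
raw_path = "abfss://status-events@<storage-account>.dfs.core.windows.net/status-notifications/"

# Auto Loader only picks up files it has not seen before
notifications = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", "/Volumes/ops/status/_schemas")
    .load(raw_path)
    .withColumn("ingested_at", F.current_timestamp())
)

# availableNow processes the backlog and stops, so the job can run on any schedule
(
    notifications.writeStream
    .option("checkpointLocation", "/Volumes/ops/status/_checkpoints")
    .trigger(availableNow=True)
    .toTable("ops.status.notifications")
)
```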

I hope this helps!
