04-01-2024 09:52 PM
How can I retrieve all Databricks performance metrics on an hourly basis? Is there a recommended method or API available for retrieving performance metrics?
04-02-2024 07:31 PM
Hi @Retired_mod
I encountered the error "getaddrinfo ENOTFOUND http" while attempting to run the API.
Could you please provide guidance on resolving this issue? Are there any prerequisites that need to be met before running the API? I'd appreciate any assistance you can offer.
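The thread never resolves this error, but "getaddrinfo ENOTFOUND http" generally means the HTTP client tried to resolve the literal string "http" as a hostname, i.e. the workspace URL was malformed (missing or broken scheme). A minimal sketch of normalizing the base URL before calling a Databricks REST endpoint, assuming the standard Clusters API path and a personal access token (both illustrative here):

```python
# Sketch, not a definitive fix: normalize the workspace host so the request
# targets "https://<workspace-host>/..." instead of a bare/broken "http".
import json
import urllib.request


def build_api_url(workspace_host: str, endpoint: str) -> str:
    """Return a well-formed https URL for a Databricks REST endpoint."""
    host = workspace_host.strip().rstrip("/")
    if host.startswith("http://"):
        # Databricks workspaces are served over TLS; upgrade the scheme.
        host = "https://" + host[len("http://"):]
    elif not host.startswith("https://"):
        host = "https://" + host
    return host + endpoint


def list_clusters(workspace_host: str, token: str) -> dict:
    """Call the (real) Clusters API: GET /api/2.0/clusters/list."""
    url = build_api_url(workspace_host, "/api/2.0/clusters/list")
    req = urllib.request.Request(
        url, headers={"Authorization": f"Bearer {token}"}
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp)
```

If the host string in your script is just "http" or "http//..." (a missing colon or slash), `build_api_url` makes the problem visible immediately.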
04-02-2024 05:38 AM
Thanks @Retired_mod for your response.
Could you please provide guidance on how we can achieve step 3 using AWS services?
โ04-03-2024 02:02 AM
Hi @Nandhini_Kumar, there are many performance metrics available; it depends on what you want to do with this data and how you plan to act on it in real time. I would strongly recommend mapping out a user journey so you collect only the metrics you need, rather than pulling all of them hourly when they won't be used.
API docs can be found here: https://docs.databricks.com/api/workspace/introduction
System tables: https://docs.databricks.com/en/administration-guide/system-tables/index.html
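For hourly collection, the system tables linked above can be queried on a schedule through the SQL Statement Execution API (`POST /api/2.0/sql/statements`). A hedged sketch, where the `warehouse_id`, the `system.compute.node_timeline` table, and its column names are assumptions to verify against the system-tables docs:

```python
# Sketch under stated assumptions: aggregate the last hour of per-node
# utilization from a system table via the SQL Statement Execution API.
import json
import urllib.request


def statement_payload(warehouse_id: str, hours: int = 1) -> dict:
    """Build the request body for POST /api/2.0/sql/statements.

    The table and columns below (system.compute.node_timeline,
    cpu_user_percent, start_time) are assumed; check the docs.
    """
    sql = (
        "SELECT cluster_id, avg(cpu_user_percent) AS avg_cpu "
        "FROM system.compute.node_timeline "
        f"WHERE start_time >= now() - INTERVAL {int(hours)} HOUR "
        "GROUP BY cluster_id"
    )
    return {"warehouse_id": warehouse_id, "statement": sql, "wait_timeout": "30s"}


def submit_statement(host: str, token: str, payload: dict) -> dict:
    """POST the statement to the workspace and return the JSON response."""
    req = urllib.request.Request(
        f"https://{host}/api/2.0/sql/statements",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=60) as resp:
        return json.load(resp)
```

Scheduling this hourly (e.g. from a Databricks job or an external cron) gives you a rolling metrics feed without scraping every endpoint.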
2 weeks ago
Any response here? Are there any APIs that expose this? Especially the job performance metrics?
Monday
Hi @rahuja can you be more specific on 'job performance'? Do you mean the spark metrics? Whether it was successful? How much it cost?
Monday
@holly yes, we need Spark metrics, e.g. the amount of compute used, the amount of memory used, etc.
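For per-run timing, the Jobs API exposes run metadata (the endpoint `GET /api/2.1/jobs/runs/list` is real; per-node CPU/memory comes from system tables rather than this API). A minimal sketch, where the duration helper and its field names (`start_time`, `end_time` in epoch milliseconds, as the Jobs API returns them) are illustrative:

```python
# Sketch: build a runs-list URL and derive wall-clock duration from a
# run object's millisecond timestamps.


def runs_list_url(host: str, job_id: int, limit: int = 25) -> str:
    """URL for GET /api/2.1/jobs/runs/list, filtered to one job."""
    return f"https://{host}/api/2.1/jobs/runs/list?job_id={job_id}&limit={limit}"


def run_duration_seconds(run: dict) -> float:
    """Wall-clock duration of a completed run.

    start_time / end_time are epoch milliseconds in the Jobs API response.
    """
    return (run["end_time"] - run["start_time"]) / 1000.0
```

Joining each run's `cluster_instance.cluster_id` against the compute system tables is one way to get the memory/CPU figures asked about here.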