Spark cluster monitoring and visibility
06-16-2022 12:33 AM
Hey. I'm working on a project where I'd like to view and experiment with Spark cluster metrics. I'd like to know the utilization percentages and maximum values for metrics like CPU, memory, and network. I've tried some open-source solutions (https://github.com/mspnp/spark-monitoring), but they don't give me what I'm looking for. Ideally, a solution that could give me insights into my Azure Databricks instances to optimize usage would be perfect. Currently, I can see some of these metrics as static images on the Metrics tab of my Spark cluster page, but it would be great if I could export that information to build my own insights or graphs.
- Labels:
- Azure
- Spark Cluster
- Spark monitoring
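Beyond the static images on the Metrics tab, one programmatic route is Spark's own monitoring REST API, which the Spark UI serves under `/api/v1`. Below is a minimal sketch: it parses the JSON returned by the `/applications/<app-id>/executors` endpoint into per-executor utilization numbers you could export or plot yourself. The field names come from the standard Spark REST API; the base URL is an assumption, since on Databricks you'd reach the Spark UI through the cluster's driver proxy rather than a plain `http://<driver>:4040`.

```python
import json
import urllib.request


def summarize_executors(executors):
    """Given the JSON list from /api/v1/applications/<app-id>/executors,
    return memory utilization % and task/core counts per executor."""
    summary = []
    for e in executors:
        mem_used = e.get("memoryUsed", 0)
        mem_max = e.get("maxMemory", 0)
        summary.append({
            "id": e.get("id"),
            "memory_used_pct": 100.0 * mem_used / mem_max if mem_max else 0.0,
            "active_tasks": e.get("activeTasks", 0),
            "total_cores": e.get("totalCores", 0),
        })
    return summary


def fetch_executors(base_url, app_id):
    # base_url is hypothetical, e.g. "http://<driver>:4040" — assumes the
    # Spark UI is reachable directly, which on Databricks it usually isn't
    # without going through the driver proxy.
    url = f"{base_url}/api/v1/applications/{app_id}/executors"
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)


# Example with a canned payload whose fields mirror the REST API response:
sample = [{"id": "1", "memoryUsed": 512, "maxMemory": 1024,
           "activeTasks": 2, "totalCores": 4}]
print(summarize_executors(sample))
```

From there the summaries could be written to a Delta table or CSV on a schedule, which would cover the "export and make my own graphs" part of the question.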
06-16-2022 03:09 AM
Hello @Saurav Santhosh
There are a few options for this:
- If you would like to see VM usage, you can send these metrics to an Azure Log Analytics workspace: https://github.com/Azure/AzureDatabricksBestPractices/blob/master/toc.md#Appendix-A
- Databricks Labs solution (Overwatch): https://databrickslabs.github.io/overwatch/
- Datadog: https://www.datadoghq.com/blog/databricks-monitoring-datadog/
06-16-2022 04:42 AM
Native support for Prometheus monitoring was added in Spark 3.0:
https://databricks.com/session_na20/native-support-of-prometheus-monitoring-in-apache-spark-3-0
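As a rough sketch of what that native support looks like (the config keys and endpoint paths below are the ones documented for Spark 3.0's monitoring; the port shown is the default Spark UI port, which may differ on a managed cluster):

```properties
# Expose executor metrics through the Spark UI in Prometheus format
spark.ui.prometheus.enabled=true
# Scrape them at:  http://<driver>:4040/metrics/executors/prometheus

# Driver/app metrics via the PrometheusServlet sink (conf/metrics.properties)
*.sink.prometheusServlet.class=org.apache.spark.metrics.sink.PrometheusServlet
*.sink.prometheusServlet.path=/metrics/prometheus
```

A Prometheus server pointed at those endpoints can then feed Grafana or any other dashboarding tool, which fits the original goal of building custom graphs.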
06-22-2022 02:38 AM
Hey @Kaniz Fatma, I appreciate the suggestions and will be looking into them. I haven't gotten to them yet, so I didn't want to say whether they worked for me or not. Since I'm looking to avoid solutions like Datadog, I'll be checking out Prometheus and @Arvind Ravish's first suggestion. Thanks!
07-29-2022 01:07 PM
Just a friendly follow-up. Did you have time to check? Do you still need help, or can you mark the response that helped you as the best answer?