10-04-2022 11:46 PM
We have a Spark pipeline that produces more than 3k Spark jobs. After the pipeline finishes and the cluster shuts down, only a subset (<1k) of these jobs can be recovered from the Spark UI.
We would like to have access to the full Spark UI after the pipeline has terminated and the cluster has shut down, for performance monitoring purposes. Is it possible to deploy a Spark History Server in Databricks? If not, what is your recommended approach?
10-06-2022 12:08 AM
Hi @Vlad Crisan, as of now a workspace is limited to 1000 concurrent job runs. A 429 Too Many Requests response is returned when you request a run that cannot start immediately. The number of jobs a workspace can create in an hour is limited to 10000 (this includes “runs submit” requests).
https://docs.databricks.com/workflows/jobs/jobs.html#create-run-and-manage-databricks-jobs
To increase the jobs limit in a workspace, refer to:
https://docs.databricks.com/administration-guide/workspace/enable-increased-jobs-limit.html
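For completeness, here is a minimal sketch of handling that 429 response when submitting runs through the Jobs API, assuming DATABRICKS_HOST and DATABRICKS_TOKEN environment variables and an illustrative task payload:

```python
# Minimal sketch: back off and retry when the Jobs API returns HTTP 429
# (the response returned when the concurrent-run limit is hit).
# Assumes DATABRICKS_HOST and DATABRICKS_TOKEN are set.
import os
import time
import requests

HOST = os.environ["DATABRICKS_HOST"]    # e.g. https://adb-....azuredatabricks.net
TOKEN = os.environ["DATABRICKS_TOKEN"]

def submit_run(payload: dict, max_retries: int = 5) -> dict:
    url = f"{HOST}/api/2.1/jobs/runs/submit"
    headers = {"Authorization": f"Bearer {TOKEN}"}
    for attempt in range(max_retries):
        resp = requests.post(url, headers=headers, json=payload, timeout=30)
        if resp.status_code == 429:
            # Honor Retry-After if present; otherwise back off exponentially.
            time.sleep(int(resp.headers.get("Retry-After", 2 ** attempt)))
            continue
        resp.raise_for_status()
        return resp.json()
    raise RuntimeError("Run could not be scheduled within the retry budget")
```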
10-17-2022 06:56 AM
Hi @Debayan Mukherjee, my question is about the total number of Spark jobs on one cluster and how these can be retrieved in the Spark UI after the cluster shuts down, rather than the number of concurrent jobs. Concrete example: if a notebook running on a cluster produces (e.g. sequentially) 3k jobs in total, then after the underlying cluster shuts down I can only see approximately 1k of them in the Spark UI. My question is whether there is a way to recover all jobs in the Spark UI.
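For reference, the ~1k cutoff matches Spark's default spark.ui.retainedJobs value of 1000: the UI evicts older jobs beyond that limit. A minimal sketch of checking the retention keys follows; they must be set in the cluster's Spark config before startup, and whether the post-termination UI honors them is exactly the open question here:

```python
# The ~1000-job cutoff matches Spark's default spark.ui.retainedJobs (1000).
# These keys must be set before the SparkContext starts -- on Databricks, in
# the cluster's Spark config -- so a notebook can only confirm their values.
print(spark.sparkContext.getConf().get("spark.ui.retainedJobs", "1000"))
print(spark.sparkContext.getConf().get("spark.ui.retainedStages", "1000"))

# Cluster Spark config (cluster creation page or Clusters API `spark_conf`),
# illustrative values -- higher retention costs driver memory:
#   spark.ui.retainedJobs    5000
#   spark.ui.retainedStages  5000
```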
10-13-2022 08:36 AM
It depends on what data you need. It can be a good option to integrate with Datadog: https://www.datadoghq.com/blog/databricks-monitoring-datadog/
You can also redirect logs to Azure Log Analytics.
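If the goal is just to keep the logs (including the Spark event logs) after the cluster goes away, here is a minimal sketch using cluster log delivery via the Clusters API; the cluster name, node type, and DBFS destination below are illustrative:

```python
# Minimal sketch: enable cluster log delivery so driver/executor logs and the
# Spark event logs (under eventlog/) survive cluster termination.
# Assumes DATABRICKS_HOST and DATABRICKS_TOKEN are set; other fields are illustrative.
import os
import requests

HOST = os.environ["DATABRICKS_HOST"]
TOKEN = os.environ["DATABRICKS_TOKEN"]

payload = {
    "cluster_name": "pipeline-cluster",
    "spark_version": "10.4.x-scala2.12",
    "node_type_id": "Standard_DS3_v2",  # illustrative Azure node type
    "num_workers": 2,
    # Logs are periodically copied to this destination and remain there
    # after the cluster shuts down.
    "cluster_log_conf": {"dbfs": {"destination": "dbfs:/cluster-logs"}},
}

resp = requests.post(
    f"{HOST}/api/2.0/clusters/create",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=payload,
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["cluster_id"])
```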
10-17-2022 06:59 AM
I would like to recover all information displayed in the Spark UI. Datadog is a good suggestion, but unfortunately we can't use external services in our application. Azure Log Analytics would be an option, but I couldn't find any reference showing how to recover the full Spark UI through it.
06-02-2023 03:29 AM
@Vlad Crisan, you can use a Databricks cluster to replay the events. Please follow this KB article: https://kb.databricks.com/clusters/replay-cluster-spark-events
Note: please spin up a cluster running Databricks Runtime 10.4 LTS.
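The KB article covers replaying on a Databricks cluster; for comparison, here is a minimal sketch of the open-source equivalent, pointing a local Spark History Server at event logs downloaded from the cluster's log-delivery path (the eventlog/ directory). SPARK_HOME and /tmp/spark-events are assumptions, not values from the KB:

```python
# Minimal sketch: replay downloaded event logs with a local Spark History Server.
# Assumes a Spark distribution at SPARK_HOME and event logs copied to
# /tmp/spark-events; both paths are illustrative.
import os
import subprocess

env = dict(os.environ)
env["SPARK_HISTORY_OPTS"] = "-Dspark.history.fs.logDirectory=file:/tmp/spark-events"

# start-history-server.sh ships with every Spark distribution; once started,
# the replayed applications are served on http://localhost:18080.
subprocess.run(
    [os.path.join(env["SPARK_HOME"], "sbin", "start-history-server.sh")],
    env=env,
    check=True,
)
```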