Custom JobGroup in Spark UI for cluster with multiple executions
02-02-2024 08:01 AM
Does anyone know what the first digits of the job group shown in the Spark UI mean when using all-purpose clusters to launch multiple jobs?
Right now the pattern is something like: [id_random]_job_[job_id]_run-[run_id]_action_[action].
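For reference, this is how I read the value from a running notebook. A minimal sketch, assuming a Databricks notebook where `spark` is already defined; `spark.jobGroup.id` is the standard Spark local-property key for job groups:

```scala
// Read the job group ID assigned to the currently running command.
// "spark.jobGroup.id" is the local property Spark uses to tag jobs with a group.
val currentJobGroup = spark.sparkContext.getLocalProperty("spark.jobGroup.id")
println(currentJobGroup)
```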
02-05-2024 09:46 PM
Hi @FerArribas
The first digits of the job group shown in the Spark UI are the execContextId and the cmdId (command ID).
You can think of the execContextId as a kind of "REPL ID".
For example, take the job group ID below:
jobGroupId: Option[String] = Some(3482599387390558128_6163657922384662685_job-532427916265395-run-416293225333467-action-6638250854918794)
execContextId = 3482599387390558128
cmdId = 6163657922384662685
executionId = job-532427916265395-run-416293225333467-action-6638250854918794
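As a rough illustration, the ID can be split into those three parts with a simple string split. This is a sketch based on the format shown above, not an official API; the variable names are just for readability:

```scala
// Sketch: split the job group ID above into its three parts.
// Assumed format: <execContextId>_<cmdId>_<executionId>
val jobGroupId =
  "3482599387390558128_6163657922384662685_job-532427916265395-run-416293225333467-action-6638250854918794"

// Split on the first two underscores only; the trailing executionId keeps its hyphens.
val Array(execContextId, cmdId, executionId) = jobGroupId.split("_", 3)

println(s"execContextId = $execContextId")  // 3482599387390558128
println(s"cmdId         = $cmdId")          // 6163657922384662685
println(s"executionId   = $executionId")    // job-532427916265395-run-416293225333467-action-6638250854918794
```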

