The stages you mentioned (Job Started, Waiting for Cluster, Cluster Ready, Started Running, Succeeded) are Databricks job lifecycle events, not Spark events.
They are stored in the Databricks internal jobs service, not in the driver/executor logs. You can access them via:
Jobs UI → View Run Events (what you already did)
Databricks REST API:
Use the Get Run endpoint (https://docs.databricks.com/api/azure/workspace/jobs/getrun) to retrieve the run's lifecycle and timing details programmatically (see the sketch below).
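As a minimal sketch of calling that endpoint, assuming your workspace URL and a personal access token are available in the DATABRICKS_HOST and DATABRICKS_TOKEN environment variables (the run_id value is a placeholder):

```python
import os
import requests

host = os.environ["DATABRICKS_HOST"]    # e.g. https://adb-<workspace-id>.<n>.azuredatabricks.net
token = os.environ["DATABRICKS_TOKEN"]  # personal access token

# Fetch the run's metadata via the Jobs API (Get Run)
resp = requests.get(
    f"{host}/api/2.1/jobs/runs/get",
    headers={"Authorization": f"Bearer {token}"},
    params={"run_id": 123456789},       # replace with your actual run ID
)
resp.raise_for_status()
run = resp.json()

# Lifecycle/timing information lives in the run's state and duration fields
print(run["state"])                     # life_cycle_state, result_state, state_message
print(run.get("start_time"), run.get("setup_duration"), run.get("execution_duration"))
```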
If you want to persist these lifecycle logs, you need to export them via the API and then write them to a volume or external storage (see the sketch below).
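A minimal sketch of that export step, reusing the `run` response from the previous snippet; the Unity Catalog volume path here is a hypothetical example, so substitute your own:

```python
import json
import time

# Hypothetical UC volume path (volumes are exposed as a FUSE path on the cluster)
events_path = "/Volumes/main/observability/job_run_logs"

# Keep only the lifecycle/timing fields we care about
payload = {
    "run_id": run["run_id"],
    "life_cycle_state": run["state"]["life_cycle_state"],
    "result_state": run["state"].get("result_state"),
    "start_time": run.get("start_time"),
    "setup_duration": run.get("setup_duration"),
    "execution_duration": run.get("execution_duration"),
}

# Write one JSON file per run, timestamped to avoid overwrites
with open(f"{events_path}/run_{payload['run_id']}_{int(time.time())}.json", "w") as f:
    json.dump(payload, f)
```

You could schedule this export itself as a small Databricks job, or run it from an external service, so the lifecycle records accumulate in your storage independently of the cluster's log delivery.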
The driver/event/executor logs will only show Spark-related execution details, not cluster provisioning or job trigger events.