01-15-2025 11:07 AM
Currently, to investigate job performance, I can look at a job's information (via the UI) to see the "Event Log".
I'd like to obtain this information programmatically, so I can analyze it across jobs. However, the docs for the `get` call (https://docs.databricks.com/api/workspace/jobs/get) do not appear to include information about the event log. Is there a way to get this info via the REST API?
Accepted Solutions
01-15-2025 11:17 AM
Hi @mrstevegross,
Unfortunately, the `jobs/get` API does not include event logs in its output. I will see if there is a workaround, but as far as I can tell it might not be possible via the REST API.
01-15-2025 11:25 AM
Darn, well, good to know. Can y'all consider this a feature request in that case?
01-15-2025 11:18 AM
Can you give me more context on why you need event logs?
01-15-2025 11:26 AM
Sure, I want to assess the overall performance of our Spark jobs, particularly the time between "CREATING" and "RUNNING". It's very time-consuming to gather this data manually via the UI; if there is a way to get it programmatically, that would be great.
a month ago
Follow-up: I see that there is documentation on exporting Spark logs (https://docs.databricks.com/aws/en/compute/clusters-manage#event-log) and init script logging (https://docs.databricks.com/aws/en/init-scripts/logs#init-script-events). I've been able to export the Spark logs themselves, but I'm not seeing the job event information there. Is there some way to gather the event log via these mechanisms?
a month ago
I also see there is a "list cluster events" API (https://docs.databricks.com/api/workspace/clusters/events); can I get the event log this way?
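If that endpoint works the way the docs suggest, here's an untested sketch of what I have in mind for measuring the "CREATING" → "RUNNING" gap. It assumes the events endpoint accepts a `cluster_id` plus an `event_types` filter and returns events carrying a `type` and an epoch-millisecond `timestamp`; the function and field names below are my guesses from the docs page, not verified against a live workspace:

```python
import json
import urllib.request


def fetch_cluster_events(host, token, cluster_id):
    """Hypothetical call to POST /api/2.0/clusters/events.

    Request/response field names are taken from the linked docs but
    are unverified here; adjust to what your workspace actually returns.
    """
    body = json.dumps({
        "cluster_id": cluster_id,
        "event_types": ["CREATING", "RUNNING"],
        "order": "ASC",
        "limit": 50,
    }).encode()
    req = urllib.request.Request(
        f"{host}/api/2.0/clusters/events",
        data=body,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp).get("events", [])


def creating_to_running_seconds(events):
    """Seconds between the first CREATING and the first RUNNING event.

    Assumes each event dict has a `type` string and an epoch-millisecond
    `timestamp`; returns None if either state never appears.
    """
    first = {}
    for ev in sorted(events, key=lambda e: e["timestamp"]):
        t = ev.get("type")
        if t in ("CREATING", "RUNNING") and t not in first:
            first[t] = ev["timestamp"]
    if "CREATING" in first and "RUNNING" in first:
        return (first["RUNNING"] - first["CREATING"]) / 1000.0
    return None
```

Then per-cluster startup latency would just be `creating_to_running_seconds(fetch_cluster_events(host, token, cluster_id))`, looped over the clusters of interest. Does that match how the events API is meant to be used?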

