
Is it possible to obtain a job's event log via the REST API?

mrstevegross
Contributor III

Currently, to investigate job performance, I can look at a job's information (via the UI) to see the "Event Log" (pictured below):

[Screenshot: the job's Event Log in the Databricks UI (mrstevegross_0-1736967992555.png)]

I'd like to obtain this information programmatically, so I can analyze it across jobs. However, the docs for the `get` call (https://docs.databricks.com/api/workspace/jobs/get) do not appear to include information about the event log. Is there a way to get this info via the REST API?
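For reference, here is a minimal sketch of the call I'm describing (the host, token, and job ID are placeholders):

```python
# Minimal sketch: fetch a job's definition via the Jobs API 2.1.
# DATABRICKS_HOST, DATABRICKS_TOKEN, and the job_id are placeholders.
import os
import requests

host = os.environ["DATABRICKS_HOST"]    # e.g. https://<workspace>.cloud.databricks.com
token = os.environ["DATABRICKS_TOKEN"]

resp = requests.get(
    f"{host}/api/2.1/jobs/get",
    headers={"Authorization": f"Bearer {token}"},
    params={"job_id": 123456789},       # placeholder job ID
)
resp.raise_for_status()
job = resp.json()

# The response contains the job settings, creator, schedule, etc.,
# but nothing resembling the "Event Log" shown in the UI.
print(job.keys())
```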


6 REPLIES

Alberto_Umana
Databricks Employee
Accepted Solution

Hi @mrstevegross,

Unfortunately, the jobs/get API does not include event logs in its output. I will see if there is a workaround, but as far as I can tell it may not be possible via the REST API.

mrstevegross
Contributor III

Darn, well, good to know. Can y'all consider this a feature request in that case?

Alberto_Umana
Databricks Employee

Can you give me more context on why you need the event logs?

mrstevegross
Contributor III

Sure, I want to assess the overall performance of our Spark jobs, particularly the time between "CREATING" and "RUNNING". Gathering this data manually via the UI is very time-consuming; if there is a way to get it programmatically, that would be great.
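For example, one thing I've been looking at (a rough sketch; the run ID is a placeholder, and I may be misreading the docs) is pulling the per-run timing fields from the Jobs Runs API, since `setup_duration` seems to cover cluster setup time for job clusters:

```python
# Rough sketch: read per-run timing fields from the Jobs Runs API 2.1.
# setup_duration (ms) appears to cover cluster setup for job clusters;
# for multi-task jobs these fields live on each task rather than the run.
import os
import requests

host = os.environ["DATABRICKS_HOST"]
token = os.environ["DATABRICKS_TOKEN"]

resp = requests.get(
    f"{host}/api/2.1/jobs/runs/get",
    headers={"Authorization": f"Bearer {token}"},
    params={"run_id": 987654321},  # placeholder run ID
)
resp.raise_for_status()
run = resp.json()

print("setup_duration (ms):", run.get("setup_duration"))
print("execution_duration (ms):", run.get("execution_duration"))
for task in run.get("tasks", []):
    print(task.get("task_key"), "setup (ms):", task.get("setup_duration"))
```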

mrstevegross
Contributor III

Follow-up: I see that there is documentation on exporting Spark logs (https://docs.databricks.com/aws/en/compute/clusters-manage#event-log) and init script logging (https://docs.databricks.com/aws/en/init-scripts/logs#init-script-events). I've been able to export the Spark logs themselves, but I'm not seeing the job event information there. Is there some way to gather the event log via these mechanisms?
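For context, this is roughly the kind of log delivery setup I mean (a sketch; the runtime version, node type, and DBFS destination are just example values):

```python
# Sketch: job cluster spec with cluster log delivery enabled.
# Driver logs, stdout/stderr, and init script logs get delivered to the
# destination below, but the UI "Event Log" does not appear there.
new_cluster = {
    "spark_version": "15.4.x-scala2.12",   # example runtime
    "node_type_id": "i3.xlarge",           # example node type
    "num_workers": 2,
    "cluster_log_conf": {
        "dbfs": {"destination": "dbfs:/cluster-logs/my-job"}  # example path
    },
}
```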

mrstevegross
Contributor III

I also see there is a "list cluster events" API (https://docs.databricks.com/api/workspace/clusters/events); can I get the event log this way?
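Here's roughly what I'd try with it (a sketch; the cluster ID is a placeholder, and I'm assuming the CREATING and RUNNING event types carry the timestamps I need):

```python
# Rough sketch: pull cluster events and compute CREATING -> RUNNING latency.
# The cluster ID is a placeholder; event types/fields per the Clusters API docs.
import os
import requests

host = os.environ["DATABRICKS_HOST"]
token = os.environ["DATABRICKS_TOKEN"]

resp = requests.post(
    f"{host}/api/2.1/clusters/events",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "cluster_id": "0117-123456-abcdefgh",   # placeholder cluster ID
        "event_types": ["CREATING", "RUNNING"],
        "order": "ASC",
        "limit": 50,
    },
)
resp.raise_for_status()
events = resp.json().get("events", [])

# Timestamps are epoch milliseconds.
creating = next((e["timestamp"] for e in events if e["type"] == "CREATING"), None)
running = next((e["timestamp"] for e in events if e["type"] == "RUNNING"), None)
if creating is not None and running is not None:
    print(f"Cluster startup took {(running - creating) / 1000:.1f}s")
```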