Databricks driver logs
03-15-2023 04:55 AM
I am running jobs on Databricks clusters. While the cluster is running, I can find the executor logs by going to the Spark cluster UI's Master dropdown, selecting a worker, and reading the stderr logs. However, once the job finishes and the cluster terminates, I can no longer see those logs. Instead, I get the screen below.
- Labels: Databricks Clusters, DriverLogs, Logs
03-15-2023 11:13 PM
Hi, when you say the job is finished and the cluster has terminated, can you still see the cluster listed on the clusters UI page? Also, could you please uncheck the auto-fetch logs option?
Unless the cluster has been deleted, the logs should still be there.
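If you need the driver and executor logs to remain available after the cluster terminates, one common approach is to configure cluster log delivery when the cluster (or the job's new_cluster) is created. Below is a minimal sketch using the Clusters REST API, not something from this thread; the workspace URL, token, runtime version, and node type are placeholders you would replace with your own values.

```python
import requests

# Placeholders -- substitute your own workspace URL and personal access token.
WORKSPACE_URL = "https://<your-workspace>.cloud.databricks.com"
TOKEN = "<personal-access-token>"

cluster_spec = {
    "cluster_name": "job-cluster-with-log-delivery",
    "spark_version": "12.2.x-scala2.12",   # example runtime; pick one available in your workspace
    "node_type_id": "i3.xlarge",           # example node type; pick one available in your workspace
    "num_workers": 2,
    # With cluster_log_conf set, driver and executor logs are periodically copied
    # to the DBFS destination and stay readable after the cluster terminates.
    "cluster_log_conf": {
        "dbfs": {"destination": "dbfs:/cluster-logs"}
    },
}

resp = requests.post(
    f"{WORKSPACE_URL}/api/2.0/clusters/create",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=cluster_spec,
)
resp.raise_for_status()
print("Created cluster:", resp.json().get("cluster_id"))
```

With this in place, the delivered logs end up under dbfs:/cluster-logs/<cluster-id>/driver and dbfs:/cluster-logs/<cluster-id>/executor, so you can read them even after the job cluster has been torn down.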
Please tag @Debayan in your next response so that I'm notified. Thank you!

03-18-2023 12:23 AM
Hi @Atul Arora
Thank you for your question! Please take a moment to review the answer above and let me know whether it fits your needs.
If it does, please help us select the best solution by clicking "Select As Best".
Your feedback helps us ensure we are providing the best possible service to you.
Thank you!

