Logs not coming up in the UI - while being written to DBFS
01-22-2024 03:07 AM - edited 01-22-2024 03:17 AM
I have a few spark-submit jobs that run via Databricks Workflows. I have configured cluster log delivery to DBFS, pointing at a location in my GCS bucket.
The logs are present in that GCS bucket for the latest run, but whenever I try to view them from the UI, the page fails and nothing is returned.
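For reference, the setting in question is the cluster's `cluster_log_conf` in the job's cluster spec. A minimal sketch of what that looks like (the spark version, node type, and all paths below are placeholders, not my actual job definition):

```python
# Minimal sketch of a job cluster spec with cluster log delivery enabled.
# spark_version, node_type_id, and all paths are hypothetical placeholders.
new_cluster = {
    "spark_version": "13.3.x-scala2.12",
    "node_type_id": "n2-standard-4",
    "num_workers": 2,
    # cluster_log_conf tells Databricks where to copy driver/executor logs;
    # on GCP this DBFS path can be backed by a GCS bucket.
    "cluster_log_conf": {
        "dbfs": {"destination": "dbfs:/cluster-logs/spark-submit-jobs"}
    },
}

spark_submit_task = {
    "parameters": ["--class", "com.example.Main", "dbfs:/jars/app.jar"]
}
```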
01-22-2024 07:08 AM
Are you able to see the logs when using the default cluster logging settings?
01-25-2024 05:47 AM - edited 01-25-2024 05:48 AM
By the default setting, do you mean setting the destination to None instead of defining a DBFS location?
I tried that too, but I still can't see the logs for that job.
01-25-2024 09:53 AM
Yes, I meant setting it to None. Is the issue specific to a particular cluster, or do you see it with all the clusters in your workspace?
01-26-2024 02:17 AM - edited 01-26-2024 02:18 AM
Firstly, each spark-submit task runs on its own job cluster. For those jobs, the behavior is the same across all jobs in our dev workspace.
In the prod workspace, the issue appears for some tasks but not for others. One way I can check whether logs are at least being delivered for an affected task is shown in the sketch below.
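To separate a delivery problem from a UI rendering problem, it can help to list what has actually landed under the configured destination for the affected task's cluster; delivery typically writes under `<destination>/<cluster-id>/driver`. A minimal sketch using the Databricks SDK for Python (the destination path and cluster ID are hypothetical):

```python
# Minimal sketch, assuming the databricks-sdk package is installed and configured.
# The destination path and cluster ID below are hypothetical placeholders.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # reads host/token from env vars or ~/.databrickscfg

# Cluster log delivery typically writes under <destination>/<cluster-id>/driver
log_root = "dbfs:/cluster-logs/spark-submit-jobs/0126-021700-abc123/driver"
for entry in w.dbfs.list(log_root):
    print(entry.path, entry.file_size)
```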
01-26-2024 04:43 AM
That is strange behavior and needs a closer look. Could you please raise a support ticket?

