01-22-2024 03:07 AM - edited 01-22-2024 03:17 AM
I have a few spark-submit jobs that run via Databricks Workflows. I have configured cluster log delivery to a DBFS location that points to my GCS bucket.
The logs for the latest run are present in that GCS bucket, but whenever I try to view them from the UI, the request fails and nothing is returned.
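For reference, the logging is configured through cluster_log_conf on each task's job cluster. Below is a minimal sketch of a Jobs API 2.1 create call along those lines; the workspace URL, token, jar path, class name, and cluster sizing are placeholders, not our actual values:

```python
# Sketch (not the real job spec) of where the log destination is set when
# creating a spark-submit task through the Jobs API 2.1.
import requests

job_spec = {
    "name": "example-spark-submit-job",
    "tasks": [
        {
            "task_key": "spark_submit_task_1",
            "spark_submit_task": {
                "parameters": [
                    "--class", "com.example.Main",      # hypothetical entry point
                    "dbfs:/jars/example-app.jar",       # placeholder jar path
                ]
            },
            "new_cluster": {
                "spark_version": "13.3.x-scala2.12",
                "node_type_id": "n1-standard-4",        # illustrative GCP node type
                "num_workers": 2,
                # Log delivery: driver and executor logs are copied here.
                "cluster_log_conf": {
                    "dbfs": {"destination": "dbfs:/cluster-logs"}
                },
            },
        }
    ],
}

resp = requests.post(
    "https://<workspace-url>/api/2.1/jobs/create",      # placeholder workspace URL
    headers={"Authorization": "Bearer <personal-access-token>"},
    json=job_spec,
)
resp.raise_for_status()
print(resp.json())  # returns the new job_id
```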
01-22-2024 07:08 AM
Are you able to see the logs by using the default cluster settings?
01-25-2024 05:47 AM - edited 01-25-2024 05:48 AM
By the default setting, do you mean setting the destination to None instead of defining a DBFS location?
I tried that as well, but I still can't see the logs for that job.
01-25-2024 09:53 AM
Yes, I meant setting it to None. Is the issue specific to a particular cluster, or do you see it with all the clusters in your workspace?
01-26-2024 02:17 AM - edited 01-26-2024 02:18 AM
Firstly, each spark-submit task runs on its own job cluster, and for those jobs the behavior is the same across all jobs in our dev workspace.
In the prod workspace, the issue appears for some tasks but not for others.
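As a workaround while the UI fails, the delivered driver logs can still be read directly from the configured destination. A minimal notebook sketch, assuming a dbfs:/cluster-logs destination and the usual <destination>/<cluster-id>/driver layout (the path and cluster ID below are placeholders, not our real values):

```python
# Read delivered driver logs directly when the jobs UI won't render them.
log_root = "dbfs:/cluster-logs"       # the configured cluster_log_conf destination
cluster_id = "<job-cluster-id>"       # taken from the task's cluster details page

driver_dir = f"{log_root}/{cluster_id}/driver"

# List the delivered driver log files (dbutils is available in Databricks notebooks).
for f in dbutils.fs.ls(driver_dir):
    print(f.path, f.size)

# Print the start of the most recent stderr log, if one was delivered.
stderr_files = sorted(p.path for p in dbutils.fs.ls(driver_dir) if "stderr" in p.name)
if stderr_files:
    print(dbutils.fs.head(stderr_files[-1], 65536))
```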
01-26-2024 04:43 AM
That is strange behavior and needs a closer look. Could you please raise a support ticket?