Warehousing & Analytics
Engage in discussions on data warehousing, analytics, and BI solutions within the Databricks Community. Share insights, tips, and best practices for leveraging data for informed decision-making.

Show full logs on job log

Kaz
New Contributor II

Is it possible to show the full logs of a Databricks job? Currently, the log output is truncated with:

*** WARNING: max output size exceeded, skipping output. ***

However, I don't believe our log files are more than 20 MB. I know you can press the Logs button and view the full logs there. However, due to admin-rights restrictions, we would like to be able to read all logs in the job overview, without having access to the Logs button (and therefore other buttons).

 

 

3 REPLIES

piotrsofts
New Contributor III

Hey Kaz,

Where did you get that?

GalenSwint
New Contributor II

I am having the same problem as Kaz. What's weird is that I have another cluster where this doesn't happen. I didn't set up that other cluster, but I wonder whether it has some setting for this that I could change.

Isi
Honored Contributor II

Hey @Kaz ,

Unfortunately, the output truncation limit in the Databricks job UI cannot be changed. Once that limit is exceeded, the rest of the output is skipped, and the full logs become accessible only through the "Logs" button, which, as you mentioned, requires elevated permissions that might not be available in your case. You could ask your admin for "Can View" permission on the job.

 

[Screenshot, 2024-10-05]
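
For reference, a workspace admin could grant that permission programmatically through the Permissions REST API. A minimal sketch; the host, token, job ID, and user below are placeholders:

```python
# Minimal sketch: grant a user CAN_VIEW on a job via the Permissions
# REST API, so they can open the run's "Logs" page themselves.
# Host, token, job ID, and user are placeholders.
import requests

host = "https://<your-workspace>.cloud.databricks.com"
token = "<admin-personal-access-token>"
job_id = "123456"

resp = requests.patch(
    f"{host}/api/2.0/permissions/jobs/{job_id}",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "access_control_list": [
            {"user_name": "teammate@example.com",
             "permission_level": "CAN_VIEW"}
        ]
    },
)
resp.raise_for_status()
```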

A good workaround, especially when access to cluster logs is restricted, is to redirect important log output to a Unity Catalog volume during job execution (see the sketch after the list below). This lets you bypass the UI limitation and store the logs in a location that can be accessed later through standard volume permissions.

By writing logs to a volume, you:

  • Avoid losing output due to UI truncation

  • Persist logs in a governed and centralized location

  • Maintain separation from cluster internals: users can read the logs without needing access to the job or cluster itself
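
Here is a minimal sketch of that pattern for a notebook or Python job task on UC-enabled compute. The catalog/schema/volume names are placeholders for a volume you can write to; volumes are FUSE-mounted under /Volumes, so the standard logging module can write there directly:

```python
# Minimal sketch: mirror log output to a Unity Catalog volume so the
# full log survives UI truncation. The volume path is a placeholder.
import logging
from datetime import datetime

LOG_DIR = "/Volumes/main/default/job_logs"  # hypothetical volume
log_path = f"{LOG_DIR}/my_job_{datetime.now():%Y%m%d_%H%M%S}.log"

logger = logging.getLogger("my_job")
logger.setLevel(logging.INFO)

# Send records both to the volume (full, persistent copy) and to
# stdout (still visible in the job UI until it truncates).
formatter = logging.Formatter("%(asctime)s %(levelname)s %(message)s")
for handler in (logging.FileHandler(log_path), logging.StreamHandler()):
    handler.setFormatter(formatter)
    logger.addHandler(handler)

logger.info("Job started")
# ... job logic ...
logger.info("Job finished")
```

Anyone with READ VOLUME on that volume can then open the file from a notebook, without any job or cluster permissions.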


"Delivering logs to volumes is in Public Preview and is only supported on Unity-Catalog-enabled compute with Standard access mode or Dedicated access mode assigned to a user. This feature is not supported on compute with Dedicated access mode assigned to a group. If you select a volume as the path, ensure you have the READ VOLUME and WRITE VOLUME permissions on the volume" 

https://learn.microsoft.com/en-us/azure/databricks/compute/configure#cluster-log-delivery
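
If you also want the cluster's own driver and worker logs delivered to a volume, the compute's cluster_log_conf can point at one. A hedged sketch against the Clusters REST API, with placeholder host, token, runtime, node type, and volume path; the "volumes" destination is the Public Preview feature quoted above, so confirm your workspace supports it:

```python
# Hedged sketch: create a cluster that delivers its logs to a UC volume.
# All identifiers below are placeholders; the "volumes" destination is
# in Public Preview per the docs quoted above.
import requests

host = "https://<your-workspace>.cloud.databricks.com"
token = "<personal-access-token>"

payload = {
    "cluster_name": "logs-to-volume-demo",
    "spark_version": "15.4.x-scala2.12",  # example runtime
    "node_type_id": "i3.xlarge",          # example node type
    "num_workers": 1,
    "cluster_log_conf": {
        # Driver and worker logs are delivered here periodically.
        "volumes": {"destination": "/Volumes/main/default/cluster_logs"}
    },
}

resp = requests.post(
    f"{host}/api/2.0/clusters/create",
    headers={"Authorization": f"Bearer {token}"},
    json=payload,
)
resp.raise_for_status()
print(resp.json()["cluster_id"])
```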

Hope this helps 🙂

Isi