03-17-2023 04:06 AM
Hi Team,
I can see logs in the Databricks console by navigating to Workflows -> job name -> Logs. These logs are very generic, like stdout, stderr, and log4j-active.log.
How can I download event, driver, and executor logs at once for a job?
Regards,
Rajesh.
03-17-2023 08:36 AM
@Rajesh Kannan R:
To download event, driver, and executor logs at once for a job in Databricks, you can follow these steps:
Note that the download-logs feature is available in both Databricks Enterprise Edition and Databricks Community Edition. However, the specific log files available for download may vary depending on the cluster configuration and the permissions of the user running the job.
03-17-2023 08:56 AM
Hi Teja,
Thank you for replying.
From the Databricks Workspace:
1) First, I navigated to Workflows -> Jobs and then searched for the job
2) Opened the job
3) Clicked "Logs" and was then directed to "Spark Driver Logs".
4) There is no option for "Log Storage". I have attached the screenshot.
Regards,
Rajesh.
11-08-2024 10:08 AM
This simply isn't a thing
03-17-2023 11:17 PM
Hi @Rajesh Kannan R
Hope all is well! Just wanted to check in to see if you were able to resolve your issue. If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help.
We'd love to hear from you.
Thanks!
03-18-2023 06:35 AM
@Rajesh Kannan R You can configure cluster log delivery on job or interactive clusters, selecting DBFS or S3/ADLS as the destination. Once configured, all logs, including driver logs, executor logs, and the event log, will be delivered to the destination. You can replay the Spark UI from the event log if the Spark UI does not load for the job after the cluster is terminated.
https://docs.databricks.com/archive/compute/configure.html#cluster-log-delivery-1
https://kb.databricks.com/clusters/replay-cluster-spark-events?from_search=113068791
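As an illustration, a minimal cluster spec with log delivery to DBFS might look like the one below. The cluster name, Spark version, node type, and destination path are placeholders rather than values from this thread; cluster_log_conf is the Clusters API field that enables delivery.
{
  "cluster_name": "job-cluster-with-log-delivery",
  "spark_version": "12.2.x-scala2.12",
  "node_type_id": "i3.xlarge",
  "num_workers": 2,
  "cluster_log_conf": {
    "dbfs": { "destination": "dbfs:/cluster-logs" }
  }
}
With the legacy Databricks CLI, the spec can be passed at cluster creation, e.g.:
# databricks clusters create --json-file cluster-spec.json
Driver, executor, and event logs are then delivered under dbfs:/cluster-logs/<cluster id>/ and remain available after the cluster terminates.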
03-20-2023 11:46 PM
@John Lourdu @Kaniz Fatma @Vidula Khanna
Hi Team,
We use a job cluster, and the logs default to DBFS. The cluster is terminated immediately after the job executes. Is there a way to download the logs from DBFS for a terminated cluster?
I am thinking of addressing it using the options below:
Regards,
Rajesh.
03-21-2023 02:13 AM
@Kaniz Fatma @John Lourdu @Vidula Khanna
Hi Team,
I managed to download the logs using the Databricks command line as below (a combined sketch follows the steps):
1. Get the job run ID using the command below:
# databricks runs list | grep -i running
2. Identify the cluster ID using the run ID:
# databricks clusters list | grep <run id from the above command>
3. Copy the logs from the Databricks cluster to my local desktop:
# databricks fs cp -r <databricks log location>/<cluster id from the above command> <location on my desktop>
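Putting the three steps together, a rough script along these lines should work with the legacy Databricks CLI. The log location dbfs:/cluster-logs, the awk column positions, and the target directory are assumptions for illustration; adjust them to your cluster's log delivery destination and your CLI's actual output layout.
#!/bin/bash
# Sketch: copy all delivered logs for the most recent running job run.
# Assumes the run ID and cluster ID are the first column of each listing.
RUN_ID=$(databricks runs list | grep -i running | awk '{print $1}')
CLUSTER_ID=$(databricks clusters list | grep "$RUN_ID" | awk '{print $1}')
# dbfs:/cluster-logs is an assumed log delivery destination -- use yours.
databricks fs cp -r "dbfs:/cluster-logs/$CLUSTER_ID" "$HOME/databricks-logs/$CLUSTER_ID"
The copy also works after the cluster has terminated, because delivered logs persist in DBFS rather than on the cluster itself.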
Regards,
Rajesh.