Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Download event, driver, and executor logs

RajeshRK
Contributor

Hi Team,

I can see logs in the Databricks console by navigating to Workflows -> job name -> logs. These logs are very generic, like stdout, stderr, and log4j-active.log.

How can I download the event, driver, and executor logs at once for a job?

Regards,

Rajesh.

1 ACCEPTED SOLUTION

Accepted Solutions

RajeshRK
Contributor

@Kaniz Fatma @John Lourdu @Vidula Khanna

Hi Team,

I managed to download the logs using the Databricks CLI as follows:

  1. Installed the Databricks CLI on my desktop (pip install databricks-cli)
  2. Configured the workspace URL and a personal access token (databricks configure --token)
  3. Followed the procedure below to download the logs

1. Get the job run id:
# databricks runs list | grep -i running
 
2. Identify the cluster id using the run id:
# databricks clusters list | grep <run id from the above command>
 
3. Copy the logs from the Databricks cluster log location to my local desktop:
# databricks fs cp -r <databricks log location>/<cluster id from the above command> <location on my desktop>
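For reference, the directory layout that step 3 copies from can be sketched in Python. The destination root (dbfs:/cluster-logs) and the cluster id below are placeholder assumptions, not values from this thread; substitute the log location actually configured on your cluster:

```python
# Sketch: the per-type log directories that Databricks cluster log delivery
# writes under <destination>/<cluster-id>.
# "dbfs:/cluster-logs" and the cluster id are placeholder assumptions.

def cluster_log_paths(cluster_id: str, destination: str = "dbfs:/cluster-logs") -> dict:
    """Return the per-type log directories under <destination>/<cluster-id>."""
    base = f"{destination}/{cluster_id}"
    return {
        "driver": f"{base}/driver",      # stdout, stderr, log4j driver logs
        "executor": f"{base}/executor",  # per-executor logs
        "eventlog": f"{base}/eventlog",  # Spark event logs (usable to replay the Spark UI)
    }

paths = cluster_log_paths("0123-456789-abcde123")
print(paths["driver"])  # dbfs:/cluster-logs/0123-456789-abcde123/driver
```

Each of those directories can then be pulled down with databricks fs cp -r, as in step 3 above.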

Regards,

Rajesh.



Anonymous
Not applicable

@Rajesh Kannan R:

To download event, driver, and executor logs at once for a job in Databricks, you can follow these steps:

  1. Navigate to the "Jobs" section of the Databricks workspace.
  2. Click on the job name for which you want to download logs.
  3. Click on the "Logs" tab to view the logs for the job.
  4. Scroll down to the "Log Storage" section and click on the "Download Logs" button.
  5. In the "Download Logs" dialog box, select the logs you want to download, such as "Driver Logs", "Executor Logs", and "Event Logs".
  6. Specify the time range for the logs and the format in which you want to download them.
  7. Click on the "Download" button to download the logs in a ZIP file.

Note that the download logs feature is available in Databricks Enterprise Edition and Databricks Community Edition. However, the specific log files available for download may vary depending on the Databricks cluster configuration and the permissions of the user running the job.

Hi Teja,

Thank you for replying.

From Databricks Workspace

1) First, I navigated to Workflows -> Jobs and searched for the job.

2) Opened the job.

3) Clicked "Logs" and was directed to "Spark Driver Logs".

4) There is no "Log Storage" option. I have attached a screenshot.

Regards,

Rajesh.

mtz629
New Contributor II

This simply isn't a thing

Anonymous
Not applicable

Hi @Rajesh Kannan R

Hope all is well! Just wanted to check in to see whether you were able to resolve your issue. If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help.

We'd love to hear from you.

Thanks!

Anonymous
Not applicable

@Rajesh Kannan R You can configure cluster log delivery on job or interactive clusters, with DBFS or S3/ADLS as the destination. Once configured, all logs, including driver logs, executor logs, and the event log, will be delivered to that destination. You can replay the Spark UI from the event log if the Spark UI no longer loads for the job after the cluster is terminated.

https://docs.databricks.com/archive/compute/configure.html#cluster-log-delivery-1

https://kb.databricks.com/clusters/replay-cluster-spark-events?from_search=113068791
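The log-delivery setting described in the first link is part of the cluster spec. As a rough sketch, here is the shape the Clusters/Jobs API accepts, expressed as a Python dict; the runtime version, node type, and destination path are all placeholders, not values from this thread:

```python
# Sketch: a job-cluster spec fragment enabling cluster log delivery to DBFS.
# spark_version, node_type_id, and the destination path are placeholders --
# adjust them to your workspace before use.
new_cluster = {
    "spark_version": "13.3.x-scala2.12",   # placeholder runtime
    "node_type_id": "Standard_DS3_v2",     # placeholder node type
    "num_workers": 1,
    "cluster_log_conf": {
        "dbfs": {"destination": "dbfs:/cluster-logs"}
    },
}
```

With this in place, driver, executor, and event logs are copied under the destination even after the job cluster terminates.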

@John Lourdu @Kaniz Fatma @Vidula Khanna

Hi Team,

We use a job cluster, and the logs default to DBFS. The cluster is terminated immediately after the job finishes. Is there any way to download the logs from DBFS for the terminated cluster?

I am thinking of addressing it by using the below options:

  1. Do not terminate the cluster immediately
  2. Copy the logs from DBFS to ADLS
  3. Terminate the cluster

Regards,

Rajesh.

