Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Want to see logs for lineage view run events

jitendrajha11
New Contributor

Hi All,

I need your help. My jobs are completing successfully, and when I click on a job there is a Lineage > View run events option; when I click on it, I see the steps below.

  1. Job Started: The job is triggered.
  2. Waiting for Cluster: The job waits for the cluster to be ready.
  3. Cluster Ready: The cluster becomes ready to execute the job.
  4. Started Running: The job starts running.
  5. Succeeded: The job completes successfully after processing the data.

I want to see detailed logs for all five stages. Where can I find them? I am delivering logs to a volume, and there I can see the driver, eventlog, executor, etc. folders, but I have checked all of them and cannot find this information. In which folder are these events stored?

4 REPLIES

bianca_unifeye
New Contributor III

https://docs.databricks.com/aws/en/jobs/monitor#export-job-runs

In the article, look for the section on exporting job runs.
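If you prefer to script the export, a minimal sketch against the long-standing runs export endpoint could look like this; note it only returns rendered views for notebook task runs, and the host, token, and run ID are placeholders you must fill in:

```python
# Sketch: export the rendered views of a (notebook-task) job run via
# GET /api/2.1/jobs/runs/export. HOST, TOKEN, and run_id are placeholders.
import requests

HOST = "https://<your-workspace>.cloud.databricks.com"
TOKEN = "<personal-access-token>"

resp = requests.get(
    f"{HOST}/api/2.1/jobs/runs/export",
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"run_id": 123456789, "views_to_export": "ALL"},
)
resp.raise_for_status()

# Each exported view is an HTML document; save them as run artifacts.
for view in resp.json().get("views", []):
    with open(f"{view['name']}.html", "w") as f:
        f.write(view["content"])
```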

For the compute logs (an SDK-based sketch follows after these steps):

  1. On the compute page, click the Advanced toggle.
  2. Click the Logging tab.
  3. Select a destination type.
  4. Enter the Log path.

https://docs.databricks.com/aws/en/compute/configure
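As a scripted alternative to those UI steps, a minimal sketch with the Databricks Python SDK might look like this; the cluster settings and the volume path are placeholder assumptions, and the volumes log destination needs a reasonably recent SDK and runtime (a dbfs: destination works the same way):

```python
# Sketch: create a cluster whose driver/executor logs are delivered to
# a Unity Catalog volume -- the scripted equivalent of the Logging tab.
# Cluster name, node type, Spark version, and the volume path are all
# placeholder assumptions; adjust them to your workspace.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import compute

w = WorkspaceClient()  # picks up host/token from env vars or config

cluster = w.clusters.create_and_wait(
    cluster_name="logged-cluster",
    spark_version="15.4.x-scala2.12",
    node_type_id="i3.xlarge",
    num_workers=1,
    cluster_log_conf=compute.ClusterLogConf(
        volumes=compute.VolumesStorageInfo(
            destination="/Volumes/main/default/cluster_logs"  # hypothetical
        )
    ),
)
print(cluster.cluster_id)
```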

nayan_wylde
Esteemed Contributor

The stages you mentioned—Job Started, Waiting for Cluster, Cluster Ready, Started Running, Succeeded—are Databricks job lifecycle events, not Spark events.
They are stored in Databricks' internal job service, not in the driver/executor logs. You can access them via:

  1. Jobs UI → View Run Events (what you already did).
  2. Databricks REST API: use the Get a single job run endpoint (https://docs.databricks.com/api/azure/workspace/jobs/getrun) to retrieve detailed lifecycle events programmatically; a sketch follows below.
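A minimal sketch of that call with plain requests, assuming a personal access token; the host, token, and run ID are placeholders, and the duration fields follow the Jobs 2.1 response shape:

```python
# Sketch: fetch lifecycle details for one job run via the Jobs API.
# HOST, TOKEN, and RUN_ID are placeholders you must supply yourself.
import requests

HOST = "https://<your-workspace>.cloud.databricks.com"
TOKEN = "<personal-access-token>"
RUN_ID = 123456789

resp = requests.get(
    f"{HOST}/api/2.1/jobs/runs/get",
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"run_id": RUN_ID},
)
resp.raise_for_status()
run = resp.json()

# The response carries the lifecycle data behind the UI events:
# `state` covers trigger/running/success, and the durations (ms)
# separate cluster setup time from actual execution time.
print(run["state"])                   # life_cycle_state / result_state
print(run.get("setup_duration"))      # roughly "Waiting for cluster"
print(run.get("execution_duration"))  # "Started running" -> finished
print(run.get("cleanup_duration"))    # cluster teardown
```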

If you want to persist these lifecycle logs:

You need to export them via the API and then write them to your volume or external storage, as sketched below.
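Continuing the sketch above, persisting that response to a Unity Catalog volume could look like the following when run on a cluster in the workspace; the volume path is hypothetical:

```python
# Sketch: write the `run` dict fetched above into a Unity Catalog
# volume so the lifecycle events outlive the Jobs UI retention.
# The volume path is hypothetical; point it at a volume you own.
import json
import time

VOLUME_DIR = "/Volumes/main/default/job_run_events"  # hypothetical

with open(f"{VOLUME_DIR}/run_{run['run_id']}_{int(time.time())}.json", "w") as f:
    json.dump(run, f, indent=2)
```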

The driver/event/executor logs will only show Spark-related execution details, not cluster provisioning or job trigger events.

In Jobs UI → View Run Events I am not able to see anything. Please find the attachment and provide step-by-step information.

jitendrajha11
New Contributor

Hi Team/Member,

My jobs are completing successfully, and when I click on a job there is a Lineage > View run events option; when I click on it, I see the steps below (screenshot attached). Where will I find the logs for the stages shown in the screenshot?

  1. Job Started: The job is triggered.
  2. Waiting for Cluster: The job waits for the cluster to be ready.
  3. Cluster Ready: The cluster becomes ready to execute the job.
  4. Started Running: The job starts running.
  5. Succeeded: The job completes successfully after processing the data.