Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Get run details of a Databricks job that provides similar data without using the API '/api/2.0/jobs/runs'

mannepk85
New Contributor III

I have a notebook that is attached to a task at the end of a job. This task pulls the status of all the other tasks in the job and checks whether each one succeeded or failed. Depending on the result, this last task sends a custom Slack notification to the channel. Below is my code.

import requests

# Assumed to be defined earlier in the notebook:
#   api_token   - bearer token for the Jobs API (currently a personal access token)
#   job_run_url - the Jobs API "get run" endpoint for the workspace
#   run_id      - the current job run id, passed into the notebook as a parameter

def get_job_run_details(run_id):
    # Fetch the run details for the given run_id from the Jobs REST API
    url = job_run_url
    headers = {
        "Authorization": f"Bearer {api_token}"
    }
    params = {
        "run_id": run_id
    }
    response = requests.get(url, headers=headers, params=params)
    return response.json()

# Get job run details
run_details = get_job_run_details(run_id)
tasks = run_details.get('tasks', [])

# Check the status of all tasks
job_status = "completed successfully"
for task in tasks:
    if 'state' in task and 'result_state' in task['state']:
        if task['state']['result_state'] != "SUCCESS":
            job_status = "failed"
            break
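
For context, the notification itself is just a webhook post; a minimal sketch of that step, assuming a Slack incoming webhook whose URL sits in a secret scope (the scope and key names here are placeholders):

# Post the result to Slack via an incoming webhook (scope/key names are hypothetical)
slack_webhook_url = dbutils.secrets.get("slack-scope", "webhook-url")
message = f"Job run {run_id} {job_status}."
resp = requests.post(slack_webhook_url, json={"text": message})
resp.raise_for_status()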

 

 

Now the problem is the bearer token. I am using a personal access token, and the documentation on generating a token with a client_id/secret isn't very clear. I cannot use a personal token in production, so I need to generate and refresh the bearer token programmatically.
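
For reference, the client_id/secret flow the docs describe boils down to a single POST against the workspace's /oidc/v1/token endpoint (OAuth machine-to-machine, client credentials). A minimal sketch, with a placeholder workspace URL and hypothetical secret names:

import requests

workspace_url = "https://<your-workspace>.azuredatabricks.net"      # placeholder
client_id = dbutils.secrets.get("sp-scope", "client-id")            # hypothetical secret names
client_secret = dbutils.secrets.get("sp-scope", "client-secret")

def get_oauth_token():
    # Exchange the service principal's client id/secret for a short-lived
    # access token (roughly an hour); call again to refresh when it expires.
    resp = requests.post(
        f"{workspace_url}/oidc/v1/token",
        auth=(client_id, client_secret),
        data={"grant_type": "client_credentials", "scope": "all-apis"},
    )
    resp.raise_for_status()
    return resp.json()["access_token"]

api_token = get_oauth_token()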

I was thinking: is there another way to fetch the run details without even using the API? The service principal should have access to the metadata of the current job, and there should be some way to access that metadata without having to go through the API route.
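
Another route that still calls the Jobs API under the hood, but avoids hard-coding a token, is the Databricks SDK for Python (databricks-sdk). It resolves credentials on its own, from the notebook context or from a service principal's DATABRICKS_CLIENT_ID / DATABRICKS_CLIENT_SECRET environment variables. A rough sketch, assuming run_id is passed in as a task parameter:

from databricks.sdk import WorkspaceClient

w = WorkspaceClient()   # credentials resolved automatically, no PAT needed
run = w.jobs.get_run(run_id=int(run_id))

job_status = "completed successfully"
for task in run.tasks or []:
    # result_state is an enum; compare on its name
    if task.state and task.state.result_state and task.state.result_state.name != "SUCCESS":
        job_status = "failed"
        break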

DBUtils also didn't help me. 
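
The closest no-API option seems to be passing the upstream results into the notebook as parameters. If the final task defines notebook parameters built from dynamic value references such as {{tasks.<task_key>.result_state}}, the Jobs service resolves them at run time and the notebook can read them with dbutils.widgets, with no REST call at all. A sketch with hypothetical task keys and parameter names (check the dynamic value references docs for the exact syntax and state values):

# The final task is assumed to define notebook parameters like
#   task_a_state = {{tasks.task_a.result_state}}
#   task_b_state = {{tasks.task_b.result_state}}
upstream_states = {
    "task_a": dbutils.widgets.get("task_a_state"),
    "task_b": dbutils.widgets.get("task_b_state"),
}

job_status = "completed successfully"
for task_key, state in upstream_states.items():
    if state.upper() != "SUCCESS":
        job_status = "failed"
        break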

2 REPLIES

szymon_dybczak
Contributor III

Hi @mannepk85 ,

You can take a look at the jobs system table. Note, though, that it is currently in public preview, so use it with caution:

https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/jobs
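
For example, something along these lines could replace the REST call in the final task. Treat it as a sketch: table and column names follow the linked doc, the schema has appeared as both system.workflow and system.lakeflow while in preview, and the exact result_state values are listed there too. job_id and run_id are assumed to be passed into the notebook as parameters.

# Read task-level results for the current run from the jobs system tables
task_states = spark.sql(f"""
    SELECT task_key, result_state
    FROM system.workflow.job_task_run_timeline
    WHERE job_id = {job_id}
      AND job_run_id = {run_id}
""").collect()

job_status = "completed successfully"
for row in task_states:
    # Tasks still running (including this one) have no result_state yet
    if row.result_state is not None and row.result_state != "SUCCEEDED":
        job_status = "failed"
        break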

This requires the user running the job to have access to system.workflow.jobs. I can make it work in non-prod, but for prod this approach doesn't work.
