Hi @dbuserng,
It is possible, but it requires a custom code setup based on your use case. You can use the Jobs REST API: https://docs.databricks.com/api/workspace/jobs.
1. Create a monitoring job: Set up a job that runs periodically and checks the completion status of the three workflows (see the scheduling sketch after the example below).
2. Use the Jobs API: Call the jobs/runs/list endpoint to get the status of the most recent run of each workflow.
3. Conditional trigger: In the monitoring job, check whether all three workflows completed successfully. If they have, call jobs/run-now to trigger the desired job.
Here is a high-level example of how you can implement this (the host, token, and job IDs are placeholders you will need to fill in):
import requests

# Placeholders -- replace with your workspace URL and a personal access token
DATABRICKS_HOST = "https://<databricks-instance>"
HEADERS = {"Authorization": "Bearer <personal-access-token>"}

# Define the job IDs for the three workflows
workflow_ids = [workflow1_id, workflow2_id, workflow3_id]

# Check whether the most recent run of a job terminated successfully
def check_workflow_status(job_id):
    response = requests.get(
        f"{DATABRICKS_HOST}/api/2.0/jobs/runs/list",
        headers=HEADERS,
        params={"job_id": job_id},
    )
    runs = response.json().get("runs", [])
    return bool(runs
                and runs[0]["state"]["life_cycle_state"] == "TERMINATED"
                and runs[0]["state"].get("result_state") == "SUCCESS")

# Check the status of all workflows
all_completed = all(check_workflow_status(job_id) for job_id in workflow_ids)

# If all workflows completed successfully, trigger the next job
if all_completed:
    trigger_response = requests.post(
        f"{DATABRICKS_HOST}/api/2.0/jobs/run-now",
        headers=HEADERS,
        json={"job_id": next_job_id},
    )
    if trigger_response.status_code == 200:
        print("Next job triggered successfully")
    else:
        print("Failed to trigger the next job")