Running child job under parent job using run_job_task
12-18-2024 11:56 PM
Hi Community,
I am trying to call another job from a workflow job using run_job_task. Currently I am manually providing the job_id of the child job. I want to know if there is any way to pass the job_name instead of the job_id, so that the deployment can be automated across multiple workspaces.
Another workaround would be to extract the job_id for a given job_name from the workspace and update it in the parent job's YAML configuration during deployment, as sketched below. I am using GitHub Actions for this. Any suggestions?
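For reference, that workaround could look something like the step below. This is only a sketch under assumptions: the step and secret names are hypothetical, and it relies on the name query filter of the Jobs API 2.1 list endpoint.

# Hypothetical GitHub Actions step: look up the child job's ID by name and
# export it for a later step that patches the parent job's configuration.
# DATABRICKS_HOST and DATABRICKS_TOKEN are assumed repository secrets.
- name: Resolve child job id
  run: |
    JOB_ID=$(curl -sf \
      -H "Authorization: Bearer ${{ secrets.DATABRICKS_TOKEN }}" \
      "${{ secrets.DATABRICKS_HOST }}/api/2.1/jobs/list?name=my_job" \
      | jq -r '.jobs[0].job_id')
    echo "CHILD_JOB_ID=${JOB_ID}" >> "$GITHUB_ENV"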
Labels: Workflows
12-19-2024 01:21 AM - edited 12-19-2024 01:22 AM
Hi @holychs,
This is possible using a lookup variable in Databricks Asset Bundles. You define a variable that looks up the job's ID from its name, and then reference that variable as the job_id in the run_job_task. Here is the code:
variables:
  my_job_id:
    description: id of the job
    lookup:
      job: "my_job"

resources:
  jobs:
    my_job:
      name: my_job
      tasks:
        - task_key: my_task
          notebook_task:
            notebook_path: <notebook path>
          existing_cluster_id: "<cluster id>"

    parent_job:
      name: parent_job
      tasks:
        - task_key: run_child_job
          run_job_task:
            job_id: ${var.my_job_id}
            job_parameters:
              my_param_value: my_param_value

targets:
  dev:
    mode: development
    default: true
    workspace:
      host: <workspace url>
The above code deployed the job correctly.
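As a usage note, the lookup variable is resolved against whichever workspace the selected target points at, so the same bundle automates the cross-workspace deployment asked about above. A minimal GitHub Actions sketch, assuming a prod target is defined alongside dev and the Databricks CLI is installed and authenticated on the runner:

# Hypothetical deployment steps: the my_job_id lookup is re-resolved in each
# target workspace at deploy time, so no job_id is hardcoded anywhere.
- name: Deploy bundle to dev
  run: databricks bundle deploy -t dev
- name: Deploy bundle to prod
  run: databricks bundle deploy -t prod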