I have a parent job that calls multiple child jobs in a workflow. Out of 10 child jobs, one has failed and the remaining 9 are still running. I want to repair the failed child tasks. Can I do that while the other child jobs are running?
Hi Community, I am trying to call another job under a workflow job using run_job_task. Currently I am manually providing the job_id of the child job. I want to know if there is any way to pass the job_name instead of the job_id. This would automate the deployment ...
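One workaround, assuming the Jobs API 2.1 `GET /api/2.1/jobs/list` endpoint (which returns each job's `job_id` alongside `settings.name`), is to resolve the id from the name at deploy time and substitute it into the `run_job_task` block. The helper below is a minimal sketch that only parses the list response, so it runs without a workspace; the job names and ids in the example payload are hypothetical.

```python
def job_id_for_name(jobs_list_response: dict, job_name: str) -> int:
    """Return the job_id whose settings.name matches job_name exactly.

    Raises LookupError on zero or multiple matches, since job names are
    not guaranteed to be unique within a workspace.
    """
    matches = [
        j["job_id"]
        for j in jobs_list_response.get("jobs", [])
        if j.get("settings", {}).get("name") == job_name
    ]
    if len(matches) != 1:
        raise LookupError(
            f"expected exactly one job named {job_name!r}, found {len(matches)}"
        )
    return matches[0]

# Hypothetical shape of a Jobs API list response:
response = {
    "jobs": [
        {"job_id": 1001, "settings": {"name": "child_etl"}},
        {"job_id": 1002, "settings": {"name": "child_report"}},
    ]
}
child_id = job_id_for_name(response, "child_etl")
```

If you deploy with Databricks Asset Bundles, an alternative is to reference the child job by its bundle resource key so the id is resolved for you at deployment.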
Hi Community, I am trying to run a Databricks workflow job using run_job_task under a for_loop. I have set the number of concurrent jobs to 2, and I can see 2 iteration jobs getting triggered successfully. But both fail with an error: "ConnectException: Connection ...
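For reference, a fan-out like the one described above can be expressed as a `for_each_task` wrapping a `run_job_task`. The dict below is a hedged sketch of the request-payload shape under the Jobs API 2.1 spec (field names as I understand them; the child `job_id` 1001 and the date inputs are hypothetical). It is plain data, so it can be checked offline.

```python
import json

# Parent-job task that runs the child job once per input, at most 2 at a time.
parent_task = {
    "task_key": "fan_out_children",
    "for_each_task": {
        # JSON-encoded list of iteration inputs
        "inputs": json.dumps(["2024-01-01", "2024-01-02", "2024-01-03"]),
        # upper bound on iterations running simultaneously
        "concurrency": 2,
        "task": {
            "task_key": "run_child",
            "run_job_task": {
                "job_id": 1001,  # hypothetical child job id
                # {{input}} references the current iteration's value
                "job_parameters": {"run_date": "{{input}}"},
            },
        },
    },
}
```

A ConnectException inside the iterations usually points at the child job's cluster rather than the loop itself, so the loop config and the child job's cluster settings are worth checking separately.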
Yes, that is what I am currently doing. I wanted to know if there is any way to repair the failed ones while the parent job is still running, so that I don't need to wait; that does not seem possible as of today.
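For completeness, repairing a run once it is eligible goes through the Jobs API 2.1 repair endpoint (`POST /api/2.1/jobs/runs/repair`), passing the run's `run_id` and the task keys to rerun. The payload builder below is a pure sketch so it can be checked without a workspace; `run_id` 555 and the task key are hypothetical, and whether the repair is accepted while sibling tasks are still running is exactly the open question in this thread.

```python
def build_repair_payload(run_id: int, failed_task_keys: list[str]) -> dict:
    """Build the repair request body: rerun only the named failed tasks."""
    if not failed_task_keys:
        raise ValueError("nothing to repair: no failed task keys given")
    return {"run_id": run_id, "rerun_tasks": failed_task_keys}

payload = build_repair_payload(555, ["child_job_3"])
```

Note that for a child launched via run_job_task, the repair targets the child's own run_id, not the parent's.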
Were you able to find a solution to the problem? I have a similar use case where I need to run multiple run_job_task tasks, and every time each one spins up a new cluster of its own, as defined in the child job. I am not able to find any relevant s...