Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Schedule job to run sequentially after another job

deep_thought
Contributor

Is there a way to schedule a job to run after another job completes?

E.g. schedule Job A, then upon its completion run Job B.

1 ACCEPTED SOLUTION


youssefmrini
Databricks Employee

There is an upcoming preview where you can run a job as a task, which lets you chain one job after another.
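If it ships the way it was described, chaining Job A and Job B through the Jobs 2.1 API could look roughly like the sketch below. The host, token, and job IDs are placeholders, and the `run_job_task` field is an assumption about how the preview will surface in the API:

```python
# Hedged sketch: create an orchestrator job whose tasks run two existing
# jobs in sequence. Host, token, and job IDs are placeholders; the
# run_job_task field is an assumption about the preview's API shape.
import requests

HOST = "https://<your-workspace>.cloud.databricks.com"  # placeholder
TOKEN = "<personal-access-token>"                        # placeholder

payload = {
    "name": "orchestrator",
    "tasks": [
        {
            "task_key": "run_job_a",
            "run_job_task": {"job_id": 111},  # placeholder ID for Job A
        },
        {
            "task_key": "run_job_b",
            # run_job_b starts only after run_job_a succeeds
            "depends_on": [{"task_key": "run_job_a"}],
            "run_job_task": {"job_id": 222},  # placeholder ID for Job B
        },
    ],
}

resp = requests.post(
    f"{HOST}/api/2.1/jobs/create",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=payload,
)
resp.raise_for_status()
print(resp.json())  # e.g. {"job_id": 12345}
```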


16 REPLIES

Ajay-Pandey
Esteemed Contributor III

Hi @deep_thought, yes you can do this, but you have to use the Jobs (Workflows) API to trigger the other job.
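For example, the final task of Job A could call the Jobs API `run-now` endpoint to kick off Job B. A minimal sketch, where the host, token, and job ID are placeholders:

```python
# Minimal sketch: trigger Job B from the last task of Job A via the
# Jobs API run-now endpoint. Host, token, and job ID are placeholders.
import requests

HOST = "https://<your-workspace>.cloud.databricks.com"  # placeholder
TOKEN = "<personal-access-token>"                        # placeholder
JOB_B_ID = 222                                           # placeholder

resp = requests.post(
    f"{HOST}/api/2.1/jobs/run-now",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"job_id": JOB_B_ID},
)
resp.raise_for_status()
print("Triggered Job B, run_id:", resp.json()["run_id"])
```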

Ajay Kumar Pandey

Harun
Honored Contributor

Hi @deep_thought,

Under the Jobs section, create your first task, then add a dependent task (snapshot attached below for reference).

[Screenshot: configuring a dependent task in the Jobs UI]
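The JSON equivalent of that UI setup is roughly the following sketch; task keys and notebook paths are placeholders:

```python
# Rough JSON equivalent of the UI setup in the snapshot; task keys and
# notebook paths are placeholders.
job_config = {
    "name": "sequential_pipeline",
    "tasks": [
        {
            "task_key": "task_a",
            "notebook_task": {"notebook_path": "/Pipelines/step_a"},
        },
        {
            "task_key": "task_b",
            # task_b runs only after task_a succeeds
            "depends_on": [{"task_key": "task_a"}],
            "notebook_task": {"notebook_path": "/Pipelines/step_b"},
        },
    ],
}
```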

Hi, I am aware you can add multiple tasks to a job, but that is not the question I asked.

Geeta1
Valued Contributor

@deep_thought you can create two tasks in the Jobs section. The second task runs only after the first one is done.

Hi, I am aware you can add multiple tasks to a single job, but that is not the question I asked.

youssefmrini
Databricks Employee

There is an upcoming preview where you can run a job as a task, which lets you chain one job after another.

Sounds great, I will look out for it.

I can't find this anywhere. Has it been released yet, or is it still planned?

@Youssef Mrini, any update on whether this feature is now available in Databricks?

youssefmrini
Databricks Employee

It will be available soon in a public preview; it was announced at the roadmap webinar.

Is there any tentative date? It would be a very useful feature.

ramravi
Contributor II

You can add a dependency task under the Tasks section of a job. The dependency task can be another Databricks job.

ranged_coop
Valued Contributor II

For now, I guess calling the jobs individually is the only option. Or maybe call the relevant notebooks from a separate orchestrator notebook, with each cell running one notebook (rough sketch below)?
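Something like this, assuming the orchestrator runs as a Databricks notebook where `dbutils` is available; the paths and timeout values are placeholders:

```python
# Orchestrator-notebook sketch: dbutils is available in Databricks
# notebooks without an import. Paths and timeouts are placeholders.
# dbutils.notebook.run raises if the child notebook fails, so the
# second call only happens after the first succeeds.
result_a = dbutils.notebook.run("/Pipelines/notebook_a", 3600)
result_b = dbutils.notebook.run("/Pipelines/notebook_b", 3600)
print(result_a, result_b)
```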

ranged_coop
Valued Contributor II

On a different note, I hope Databricks introduces failure dependencies in its workflows, i.e. call a job if the job it depends on fails, plus custom e-mails with attachments...
