Community Platform Discussions
Connect with fellow community members to discuss general topics related to the Databricks platform, industry trends, and best practices. Share experiences, ask questions, and foster collaboration within the community.

How to disable task in databricks job workflow

shubbansal27
New Contributor II

I don't see any option to disable a task. I have created a workflow that contains multiple tasks.

Sometimes I need to disable a task for an experiment, but there is no straightforward way to do this. I feel a "Disable" button is a must-have option in task settings.

Currently I am using a workaround, but it is not a good approach. Please suggest a fix.

Whenever I need to disable a task, I add a dbutils.notebook.exit() call at the top of the task notebook to skip further execution. But this doesn't prevent the cluster from spawning, which wastes resources and adds delay to the workflow.
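Roughly, the workaround looks like this (a sketch; the `run_task` parameter name is a hypothetical choice, not part of the original post):

```python
# Sketch of the workaround: a job parameter, read as a notebook widget,
# decides whether the task runs at all. Names here are assumptions.

def should_skip(param_value: str) -> bool:
    """Return True when the task should be skipped (parameter is off)."""
    return str(param_value).strip().lower() in {"false", "0", "no"}

# Inside the task notebook (Databricks only; `dbutils` is predefined there):
# run_task = dbutils.widgets.get("run_task")   # hypothetical job parameter
# if should_skip(run_task):
#     dbutils.notebook.exit("Task disabled via run_task parameter")
```

The pure `should_skip` helper keeps the decision logic testable outside a notebook, but the cluster assigned to the task still starts, which is exactly the waste described above.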


9 REPLIES

User16539034020
Databricks Employee

Hi, shubbansal27:

Thanks for contacting Databricks Support!

At the moment, this can only be achieved using the if/else (conditional) task type. For this, you would need to establish a variable to denote whether you want a task to be paused and the task type to implement the verification before running the task. The promising news is that our engineering team is actively exploring ways to simplify this process. You can anticipate a more user-friendly feature to be introduced in the near future.

Monica
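For reference, the if/else (conditional) task approach can be sketched in the job's JSON definition roughly like this (a sketch; the `run_training` parameter, task keys, and notebook path are all hypothetical):

```json
{
  "tasks": [
    {
      "task_key": "check_run_training",
      "condition_task": {
        "op": "EQUAL_TO",
        "left": "{{job.parameters.run_training}}",
        "right": "true"
      }
    },
    {
      "task_key": "training",
      "depends_on": [
        { "task_key": "check_run_training", "outcome": "true" }
      ],
      "notebook_task": { "notebook_path": "/Workspace/notebooks/train" }
    }
  ]
}
```

With this wiring, setting the job parameter `run_training` to anything other than `"true"` skips the downstream task without starting its cluster.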

Hi,

I am also facing a similar issue where I want to disable tasks according to the environment where the job runs. For example: if the environment is dev, run only Task 1; otherwise run Task 2 and Task 3.

Could you please help me understand how that variable can be used to pause the task? Currently, I am not able to find any option that disables the task.

Can you let us know when this "more user-friendly feature" to disable a specific task in a workflow will be available?


Siebert_Looije
Contributor

Hi,
Thanks for your question. Right now there is no way to do this within the workflow.

So the way we are doing it now is passing a task value through the workflow, checking that value at the beginning of the notebook, and using dbutils.notebook.exit to skip the rest of the notebook when the task is switched off.
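That pattern, sketched (the task key `config` and value key `run_task2` are assumed names, not from the original post):

```python
# Sketch of gating a notebook on a task value set by an upstream task.
# All key/task names below are assumptions.

def is_enabled(value) -> bool:
    """Interpret a task value as an on/off switch for the task."""
    return str(value).strip().lower() in {"true", "1", "yes"}

# In an upstream "config" task (Databricks only):
# dbutils.jobs.taskValues.set(key="run_task2", value="false")
#
# At the top of the downstream notebook:
# flag = dbutils.jobs.taskValues.get(taskKey="config", key="run_task2",
#                                    default="true", debugValue="true")
# if not is_enabled(flag):
#     dbutils.notebook.exit("Skipped: run_task2 is false")
```

As noted below in the thread, this skips the notebook body but still spins up the task's cluster.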

Hi Siebert,

Thanks for the reply on my post. Your understanding is correct: this approach skips the rest of the task, but it doesn't stop the cluster I have assigned to the task from spawning.

Actually, I want to attach a different run schedule to each individual task, but currently the schedule is only at the job level.

For example, my workflow contains ML training tasks and prediction tasks, and I don't want to run the training tasks daily.
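One possible stopgap for the per-task schedule, building on the same exit pattern (a sketch; the Monday-only rule is an assumed example, not from the thread):

```python
from datetime import date

# Hypothetical gate: run training tasks only on Mondays while the job
# itself runs daily; prediction tasks run unconditionally.

def run_training_today(today: date) -> bool:
    """Return True on the days training is allowed to run (Monday here)."""
    return today.weekday() == 0  # Monday

# In the training notebook (Databricks only):
# if not run_training_today(date.today()):
#     dbutils.notebook.exit("Training skipped: not a scheduled day")
```

The same date check could instead feed a condition task so the training task's cluster never starts on off days.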


Anonymous
Not applicable

Hi @shubbansal27 

We haven't heard from you since the last response from @Siebert_Looije, and I was checking back to see if the suggestions helped you.

If you have found a solution, please share it with the community, as it can be helpful to others.

Also, please don't forget to click the "Select As Best" button whenever the information provided helps resolve your question.

Hi Vidula,

I am looking for a solution with a schedule option at the task level, or an option to disable a task.

VickiT
New Contributor II

I know this post was a while ago, but I've hit this issue and come up with using if/else conditions between the tasks, with a job parameter "IsActiveTask2" set to True or False; Task2 is then dependent on the success of this conditional step being True. I hope that makes sense.
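VickiT's setup might look roughly like this in the job's JSON definition (a sketch; the parameter name follows her post, while the task keys and notebook path are hypothetical):

```json
{
  "parameters": [
    { "name": "IsActiveTask2", "default": "True" }
  ],
  "tasks": [
    {
      "task_key": "IsActiveTask2_check",
      "condition_task": {
        "op": "EQUAL_TO",
        "left": "{{job.parameters.IsActiveTask2}}",
        "right": "True"
      }
    },
    {
      "task_key": "Task2",
      "depends_on": [
        { "task_key": "IsActiveTask2_check", "outcome": "true" }
      ],
      "notebook_task": { "notebook_path": "/Workspace/notebooks/task2" }
    }
  ]
}
```

Because Task2 depends on the condition task's `true` outcome, setting `IsActiveTask2` to False skips Task2 entirely, and its cluster never starts.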
