07-15-2023 11:19 PM
I don't see any option to disable a task. I have created a workflow that contains multiple tasks.
Sometimes I need to disable certain tasks for my experiments, but there is no straightforward way to do this. I feel a "disable" button is a must-have option in the task settings.
Currently I am using a workaround, but it is not a good one. Please suggest a fix.
Whenever I need to disable a task, I just add a "dbutils.notebook.exit" call at the top of the task notebook to skip further execution. However, this doesn't prevent the cluster from spawning, which wastes resources and adds delay to the workflow.
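For reference, this is all the workaround amounts to in the first cell of each task notebook (the exit message is arbitrary):

```python
# Temporary "disable" hack: stop the task notebook before any real work runs.
# Note: the job still spins up the task's cluster before this line executes.
dbutils.notebook.exit("task skipped for this run")
```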
07-19-2023 10:02 AM
Hi, shubbansal27:
Thanks for contacting Databricks Support!
At the moment, this can only be achieved using the if/else (condition) task type. You would define a variable (for example, a job parameter) that indicates whether a task should be paused, and add an if/else task that checks it before the task runs. The promising news is that our engineering team is actively exploring ways to simplify this process, so you can expect a more user-friendly feature in the near future.
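As a rough sketch, the relevant pieces of the job definition could look like this (the parameter name, task keys, and notebook path below are illustrative, not taken from your job):

```python
# Sketch of two task fragments in a Jobs API 2.1-style job definition, written as
# Python dicts. "run_experiment" is a hypothetical job parameter used as the flag.
check_task = {
    "task_key": "check_run_experiment",
    "condition_task": {
        "op": "EQUAL_TO",
        "left": "{{job.parameters.run_experiment}}",
        "right": "true",
    },
}

experiment_task = {
    "task_key": "experiment_task",
    # Runs only when the condition task evaluates to true; when the job parameter
    # is "false", this task (and its cluster) is skipped entirely.
    "depends_on": [{"task_key": "check_run_experiment", "outcome": "true"}],
    "notebook_task": {"notebook_path": "/Workspace/experiments/train"},
}
```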
Monica
02-07-2024 03:23 AM
Hi,
I am also facing a similar issue where I want to disable tasks depending on the environment the job runs in. For example: if the environment is dev, run only Task 1; otherwise run Task 2 and Task 3.
Could you please help me understand how that variable can be used to pause a task? Currently, I am not able to find any option that will disable a task.
05-30-2024 08:31 AM
Can you let us know when this "more user-friendly feature" to disable a specific task in a workflow will be available?
07-22-2023 02:21 AM
Hi,
Thanks for your question. Right now there is no way of doing this within the workflow.
So the way we are doing this now is passing a task value through the workflow, checking that value at the beginning of the notebook, and using dbutils.notebook.exit to skip the rest of the notebook when the task should not run.
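A minimal sketch of that check (the task key, value key, and defaults are illustrative, not from a real job):

```python
# Read a flag set by an upstream task (via dbutils.jobs.taskValues.set) and exit
# early when this task is meant to be "disabled".
should_run = dbutils.jobs.taskValues.get(
    taskKey="set_flags",   # hypothetical upstream task that sets the flag
    key="run_this_task",
    default="true",
    debugValue="true",     # used when the notebook runs outside a job
)

if str(should_run).lower() != "true":
    # Skips the rest of this notebook, though the task's cluster has already started.
    dbutils.notebook.exit("skipped via task value")
```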
07-22-2023 10:56 PM - edited 07-22-2023 10:57 PM
Hi Siebert,
Thanks for replying to my post. Your understanding is correct: this approach skips the rest of the task, but it doesn't stop the cluster I have assigned to the task from spawning.
Actually, I want to attach a different run schedule to each individual task, but currently the schedule is only at the job level.
For example: in my workflow there are ML training tasks and prediction tasks, and I don't want to run the training tasks daily.
07-22-2023 09:05 PM
We haven't heard from you since the last response from @Siebert_Looije, and I was checking back to see if her suggestions helped you.
Otherwise, if you have found a solution, please share it with the community, as it can be helpful to others.
Also, please don't forget to click on the "Select As Best" button whenever the information provided helps resolve your question.
07-22-2023 10:59 PM
Hi Vidula,
I am looking for a solution with a schedule option at the task level, or an option to disable a task.
3 weeks ago
I know this post was a while ago, but I've hit this issue and come up with using if/else conditions between the tasks, with a job parameter "IsActiveTask2" set to True or False. I can then make Task2 dependent on the success of this conditional step evaluating to True. I hope that makes sense?
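If it helps, this is roughly how that parameter can be overridden for a single run through the Jobs API "run now" request (the job ID is hypothetical):

```python
# Sketch of a Jobs API 2.1 run-now request body that overrides the job parameter
# for one run, so the if/else task evaluates to False and Task2 is skipped.
run_now_payload = {
    "job_id": 123456789,
    "job_parameters": {"IsActiveTask2": "False"},
}
```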