07-17-2024 01:12 AM
A task may be used to process some data. If we have 10 such tasks in a job and we want to process only a couple of datasets through a couple of those tasks, is that possible?
08-01-2024 01:42 AM
Hi @Retired_mod, where within the Databricks Workflows feature is one able to execute a single task of a job in isolation? I don't see any way to control this via the UI or the Jobs API.
08-01-2024 02:33 AM
No, there is no direct way to do it, but you can work around it by adding a custom parameter such as "Skip_job" to each task and having every notebook check it and exit early when it is set to True. Leave it at False by default so all tasks run; when you want to run only one or two tasks, set Skip_job = True for the rest in the Databricks Workflows console. That way only the tasks you care about do any real work.
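As a minimal sketch of the notebook-side check (assuming a Python notebook and a task parameter named skip_job; the name and the string values are just conventions, not a built-in Databricks setting):

# Read the task parameter; default to "false" so the task runs normally
# when the job does not pass anything.
dbutils.widgets.text("skip_job", "false")
should_skip = dbutils.widgets.get("skip_job").strip().lower() == "true"

if should_skip:
    # Exit early without doing any work. The task still starts and reports
    # success, so downstream tasks in the job are not blocked.
    dbutils.notebook.exit("Skipped via skip_job parameter")

# ... the task's actual processing logic goes here and only runs
# when skip_job is "false" ...

Note that the parameter does nothing by itself; each notebook has to read it and exit early, and a skipped task will still appear to start and finish in the run view.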
08-01-2024 03:15 AM
Ah great workaround! Thank you.
11-21-2024 03:28 PM
How do you add this custom parameter? Under Tasks there is a "Parameters" option for passing parameters to the notebook from the task. I tried adding a "skip_job" key with a "True/False" value in Parameters, but when I kick off the task, my other task still goes into the "pending" state. Where EXACTLY can I add the "skip_job" custom parameter?
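For what it is worth, here is a hypothetical sketch of where such a key lives in a Jobs API 2.1 task definition, written as a Python dict (the task_key and notebook_path values are placeholders). Entries under base_parameters show up in the task's Parameters box in the UI and can be read in the notebook with dbutils.widgets.get:

# Hypothetical fragment of a job's "tasks" list in a Jobs API 2.1 payload.
task_definition = {
    "task_key": "process_dataset_1",  # placeholder task name
    "notebook_task": {
        "notebook_path": "/Workspace/jobs/process_dataset_1",  # placeholder path
        # Key/value pairs here are passed to the notebook as widget values;
        # the notebook must read "skip_job" and exit early for it to have any effect.
        "base_parameters": {"skip_job": "true"},
    },
}

The parameter by itself does not change which tasks the job schedules; it only lets the notebook decide to exit early.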
12-10-2024 05:03 AM
Is there any work in progress on this feature request? I would be interested in running tasks in isolation from the others during development.