Use Case:
Updating a Databricks job with multiple tasks can be time-consuming and error-prone when the same change (such as adding a new parameter) must be applied to each task manually.
Possible Solutions:
1. Using the Databricks CLI jobs reset command
You can simplify the update process by resetting the job using a modified JSON definition:
Command:
databricks jobs reset --json @path/to/reset.json
Steps:
In the Databricks Job UI, open the job.
Click the kebab menu (three dots) next to Run now and select View as Code.
Choose the JSON tab and select the Reset option.
Download the JSON definition.
Modify the JSON as needed (e.g., add new parameters to each task).
Run the CLI command above to apply the changes.
This method lets you overwrite the entire job configuration with a single command instead of editing each task in the UI.
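For illustration, a trimmed-down reset.json for a two-task job might look like the sketch below after adding a new env parameter to each task. The job ID, job name, task keys, notebook paths, cluster ID, and the env parameter itself are hypothetical placeholders; also note that reset replaces the job's entire settings, so new_settings must contain the complete definition, not just the fields you changed.

{
  "job_id": 123,
  "new_settings": {
    "name": "nightly_etl",
    "tasks": [
      {
        "task_key": "ingest",
        "existing_cluster_id": "1234-567890-abcde123",
        "notebook_task": {
          "notebook_path": "/Workspace/etl/ingest",
          "base_parameters": { "env": "prod" }
        }
      },
      {
        "task_key": "transform",
        "depends_on": [ { "task_key": "ingest" } ],
        "existing_cluster_id": "1234-567890-abcde123",
        "notebook_task": {
          "notebook_path": "/Workspace/etl/transform",
          "base_parameters": { "env": "prod" }
        }
      }
    ]
  }
}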


2. Using Job-Level Parameters (If the Parameter Is the Same Across All Tasks)
If the new parameter is common to all tasks, consider adding it at the job level:
Go to the Job details pane and click Edit parameters.
In the Job parameters dialog, add or update parameters using the Key and Value fields.
This eliminates the need to update each task individually.
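As a sketch, a job-level parameter appears in the job's JSON definition as a parameters array alongside tasks; the job name, task keys, notebook paths, and the env parameter below are hypothetical. Job parameters are passed to tasks that accept key-value parameters (a notebook task can read the value with dbutils.widgets.get("env")), and other task types can reference them with the {{job.parameters.env}} dynamic value reference.

{
  "name": "nightly_etl",
  "parameters": [
    { "name": "env", "default": "prod" }
  ],
  "tasks": [
    {
      "task_key": "ingest",
      "notebook_task": { "notebook_path": "/Workspace/etl/ingest" }
    },
    {
      "task_key": "transform",
      "depends_on": [ { "task_key": "ingest" } ],
      "notebook_task": { "notebook_path": "/Workspace/etl/transform" }
    }
  ]
}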