"Run now with different parameters" - different parameters not recognized by jobs involving multiple tasks

swzzzsw
New Contributor III

I'm running a Databricks job involving multiple tasks and would like to run the job with a different set of task parameters. I can achieve that by editing each task and changing the parameter values, but that gets very manual when I have a lot of tasks. I'm thinking of using the "Run now with different parameters" option and passing in a different job JSON file. However, it looks like no matter what I pass into the JSON file at the job level, my tasks still run with the old parameter values.

Does "run now with different parameters" option support changing parameter values for several tasks at once?

5 REPLIES

Anonymous
Not applicable

Hello, @Shirly Wang​! My name is Piper, and I'm a moderator for the Databricks community. Welcome to the community and thank you for your question! It's nice to meet you. Let's give the members of the community a chance to respond before we return to your question.

Thanks for your patience.

Kaniz
Community Manager

Hi @Shirly Wang, here's how to run a job with different parameters.

You can use Run Now with Different Parameters to re-run a job with different parameters or different values for existing parameters.

  1. Click the arrow next to Run Now and select Run Now with Different Parameters or, in the Active Runs table, click Run Now with Different Parameters. Enter the new parameters depending on the type of task:
    • Notebook: You can enter parameters as key-value pairs or a JSON object. You can use this dialog to set the values of widgets.
    • JAR and spark-submit: You can enter a list of parameters or a JSON document. The provided parameters are merged with the default parameters for the triggered run. If you delete keys, the default parameters are used. You can also add task parameter variables for the run.
  2. Click Run.

Source
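
For reference, the same override can be issued through the Jobs API 2.1 run-now endpoint. Below is a minimal sketch in Python; the workspace URL, token, job ID, and parameter names are all placeholders, and note that notebook_params is a single map applied to every notebook task in the run, not a per-task setting:

  import requests

  # Minimal sketch: trigger a job run with overridden widget values.
  # Host, token, job_id, and parameter names are placeholders.
  resp = requests.post(
      "https://<databricks-instance>/api/2.1/jobs/run-now",
      headers={"Authorization": "Bearer <personal-access-token>"},
      json={
          "job_id": 12345,
          # One map for the whole run: these key-value pairs are merged
          # with each notebook task's default parameters.
          "notebook_params": {"env": "staging", "run_date": "2022-06-01"},
      },
  )
  resp.raise_for_status()
  print(resp.json()["run_id"])  # ID of the triggered run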

ally_r
New Contributor II

This is an issue for me too. I am using a multi-stage job calling the same notebook with different values for the parameters. This feels like a bug to me.
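
For anyone comparing, the per-task values themselves live in the job definition: each task can carry its own base_parameters even when every task points at the same notebook. A rough sketch of such a job spec, with all names and IDs as placeholders:

  import requests

  # Sketch of a two-task job where the same notebook gets different
  # defaults per task via base_parameters (names and IDs are placeholders).
  job_spec = {
      "name": "multi-stage-example",
      "tasks": [
          {
              "task_key": "stage_a",
              "existing_cluster_id": "<cluster-id>",
              "notebook_task": {
                  "notebook_path": "/Shared/etl",
                  "base_parameters": {"stage": "a"},
              },
          },
          {
              "task_key": "stage_b",
              "depends_on": [{"task_key": "stage_a"}],
              "existing_cluster_id": "<cluster-id>",
              "notebook_task": {
                  "notebook_path": "/Shared/etl",
                  "base_parameters": {"stage": "b"},
              },
          },
      ],
  }
  requests.post(
      "https://<databricks-instance>/api/2.1/jobs/create",
      headers={"Authorization": "Bearer <personal-access-token>"},
      json=job_spec,
  )

A run-now override, by contrast, supplies one notebook_params map that is merged into every task's base_parameters, which is why a single override hits all stages at once.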

Kaniz
Community Manager
Community Manager

Hi @Alastair Reilly​ , I was checking to see if my suggestions helped you.

Otherwise, if you have found a solution, please share it with the community, as it can be helpful to others.

Also, please don't forget to click the "Select As Best" button whenever the information provided helps resolve your question.

erens
New Contributor II

Hello,

I am also facing the same issue. The problem is described below:

  1. I have a multi-task job. This job consists of multiple "spark_python_task"-type tasks that execute a Python script on a Spark cluster. The pipeline is created within a CI/CD process, and all tasks have their default parameters.
  2. When I run this job via the Jobs UI by clicking the "Run now with different parameters" button, I can specify only a single parameter set for "spark_python_task". I would like to be able to pass arguments to tasks separately. Instead, when I pass parameters, I observe that they are passed to all tasks.
  3. Even though it is stated that parameters are merged with the default parameters, I am observing that my parameters are overridden. (reference)

Apart from the Jobs UI, I also hit the same problem when triggering the job via the API (reference).

In short, how can I pass different parameters to individual tasks, and have those parameters merged instead of overridden, in a multi-task job?
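
To make the API behaviour concrete, here is a minimal sketch of the call I mean (the job ID and argument values are placeholders). As far as I can tell, python_params is one flat list handed to every spark_python_task in the run, and it replaces each task's default parameters rather than merging with them:

  import requests

  # run-now with python_params (job_id and values are placeholders).
  # The list reaches every spark_python_task as command-line arguments
  # (sys.argv), replacing the defaults instead of merging with them.
  resp = requests.post(
      "https://<databricks-instance>/api/2.1/jobs/run-now",
      headers={"Authorization": "Bearer <personal-access-token>"},
      json={
          "job_id": 12345,
          "python_params": ["--env", "staging"],
      },
  )
  resp.raise_for_status()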
