Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

"Run now with different parameters" - different parameters not recognized by jobs involving multiple tasks

swzzzsw
New Contributor III

I'm running a Databricks job involving multiple tasks and would like to run the job with a different set of task parameters. I can achieve that by editing each task and changing the parameter values, but that gets very manual when I have a lot of tasks. I'm thinking of using the "Run now with different parameters" option and passing in a different job JSON file. However, no matter what I pass into the JSON file at the job level, my tasks still run with the old parameter values.

Does "run now with different parameters" option support changing parameter values for several tasks at once?

3 REPLIES

Anonymous
Not applicable

Hello, @Shirly Wang! My name is Piper, and I'm a moderator for the Databricks community. Welcome to the community and thank you for your question! It's nice to meet you. Let's give the members of the community a chance to respond before we return to your question.

Thanks for your patience.

ally_r
New Contributor II

This is an issue for me too. I am using a multi-task job that calls the same notebook several times with different parameter values for each task. This feels like a bug to me.
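
To illustrate that setup, here is a rough sketch of a job definition in which two tasks run the same notebook with different base_parameters; the task keys, notebook path, job ID, and parameter values are made up for illustration.

    # Hypothetical multi-task job definition: the same notebook is used by two
    # tasks, each with its own base_parameters.
    job_tasks = [
        {
            "task_key": "load_eu",
            "notebook_task": {
                "notebook_path": "/Repos/pipeline/load",
                "base_parameters": {"region": "eu"},
            },
        },
        {
            "task_key": "load_us",
            "notebook_task": {
                "notebook_path": "/Repos/pipeline/load",
                "base_parameters": {"region": "us"},
            },
        },
    ]

    # "Run now with different parameters" exposes a single notebook_params map
    # for the whole run; the same map reaches both tasks above, so the per-task
    # region values cannot be overridden individually at run time.
    run_now_override = {"job_id": 12345, "notebook_params": {"region": "apac"}}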

erens
New Contributor II

Hello,

I am also facing the same issue. The problem is described below:

  1. I have a multi-task job. This job consists of multiple tasks of the "spark_python_task" kind, each of which executes a Python script on a Spark cluster. The pipeline is created within a CI/CD process and all tasks have their own default parameters.
  2. When I run this job via the Jobs UI by clicking the "Run now with different parameters" button, I can specify only one single parameter set for "spark_python_task". I would like to be able to pass arguments to each task separately. When I do pass parameters, I observe that they are passed to all tasks.
  3. Even though it is stated that parameters are merged with the default parameters, I am observing that my parameters are overridden. (reference)

Apart from the Jobs UI, I also run into the same problem when I trigger the job using the API (reference).

In short, how can I pass different parameters to individual tasks, and have them merged with the defaults instead of overridden, in a multi-task job?
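
To make the mismatch concrete, here is a rough sketch contrasting per-task defaults in a job definition built from spark_python_task tasks with the single run-level python_params field available at run time. Task keys, file paths, the job ID, and argument values are hypothetical; the replace-rather-than-merge behaviour in the comments is the one reported above.

    # Job definition side (as created by CI/CD): each spark_python_task carries
    # its own default "parameters" list.
    job_spec_tasks = [
        {
            "task_key": "ingest",
            "spark_python_task": {
                "python_file": "dbfs:/pipelines/ingest.py",
                "parameters": ["--source", "raw"],
            },
        },
        {
            "task_key": "transform",
            "spark_python_task": {
                "python_file": "dbfs:/pipelines/transform.py",
                "parameters": ["--mode", "incremental"],
            },
        },
    ]

    # Run side (UI button or POST /api/2.1/jobs/run-now): python_params is a
    # single flat list defined once per run. The same list is handed to every
    # spark_python_task in the run and, as reported above, replaces each task's
    # default "parameters" rather than merging with them.
    run_now_override = {
        "job_id": 12345,  # hypothetical
        "python_params": ["--run-date", "2023-01-01"],
    }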
