01-24-2022 11:17 AM
I'm running a Databricks job involving multiple tasks and would like to run the job with different sets of task parameters. I can achieve that by editing each task and changing the parameter values, but that gets very manual when I have a lot of tasks. I'm thinking of using the "Run now with different parameters" option and passing in a different job JSON file. However, it looks like no matter what I pass in at the job level, my tasks still run with the old parameter values.
Does the "Run now with different parameters" option support changing parameter values for several tasks at once?
01-25-2022 06:51 AM
Hello, @Shirly Wang! My name is Piper, and I'm a moderator for the Databricks community. Welcome to the community and thank you for your question! It's nice to meet you. Let's give the members of the community a chance to respond before we return to your question.
Thanks for your patience.
08-25-2022 07:36 AM
This is an issue for me too. I am using a multi-task job that calls the same notebook with a different set of parameter values in each task. This feels like a bug to me.
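For comparison, here is a stripped-down version of my job definition (placeholder names and paths), where each task sets its own base_parameters on the same notebook:

```python
# Sketch of a multi-task job definition (Jobs API 2.1 style) where two
# tasks run the same notebook with different parameter values.
# Names and paths are placeholders; cluster config is omitted.
job_settings = {
    "name": "multi-task-example",
    "tasks": [
        {
            "task_key": "stage_a",
            "notebook_task": {
                "notebook_path": "/Repos/me/project/etl",
                "base_parameters": {"table": "sales"},
            },
        },
        {
            "task_key": "stage_b",
            "notebook_task": {
                "notebook_path": "/Repos/me/project/etl",
                "base_parameters": {"table": "inventory"},
            },
        },
    ],
}
# "Run now with different parameters" appears to take only one
# notebook_params map for the whole run, so there is no obvious way
# to override stage_a and stage_b separately.
```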
12-14-2022 04:32 AM
Hello,
I am also facing the same issue. The problem is described below:
Apart from the Jobs UI, I run into the same problem when I trigger the job through the API.
In short, how can I pass different parameters to individual tasks and have those parameters merged, instead of overridden, in a multi-task job?
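One possibility, if your workspace supports job-level parameters (a newer jobs feature, so this is an assumption about your setup): define the shared keys as job parameters, reference them in each task as {{job.parameters.<name>}}, and override them in one place at run time:

```python
# Sketch: override a job-level parameter for a single run via run-now.
# Assumes the job defines a parameter named "run_date"; the host,
# token, and job ID are placeholders.
import requests

resp = requests.post(
    "https://<workspace-url>/api/2.1/jobs/run-now",
    headers={"Authorization": "Bearer <personal-access-token>"},
    json={
        "job_id": 12345,  # placeholder job ID
        "job_parameters": {"run_date": "2022-12-14"},
    },
)
resp.raise_for_status()
```

This updates the value for every task that references the parameter in a single call, rather than editing each task by hand.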
Monday
Dear Team, for now I found a workaround: disconnect the bundle source on Databricks, edit the parameters you want for the run, and trigger it. After execution, redeploy your code from the repository.
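For the redeploy step, something like this should work (a sketch assuming the newer Databricks CLI with bundle commands; the target name is a placeholder):

```python
# Redeploy the asset bundle after the manual edit so the job
# definition matches the repository again. Assumes the Databricks CLI
# with "bundle" commands is on PATH; "dev" is a placeholder target.
import subprocess

subprocess.run(
    ["databricks", "bundle", "deploy", "--target", "dev"],
    check=True,
)
```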