"Run now with different parameters" - different parameters not recognized by jobs involving multiple tasks
01-24-2022 11:17 AM
I'm running a Databricks job involving multiple tasks and would like to run the job with a different set of task parameters. I can achieve that by editing each task and changing the parameter values, but it gets very manual when I have a lot of tasks. I'm thinking of using the "Run now with different parameters" option and passing in a different job JSON file. However, it looks like no matter what I pass into the JSON file at the job level, my tasks still run with the old parameter values.
Does the "Run now with different parameters" option support changing parameter values for several tasks at once?
01-25-2022 06:51 AM
Hello, @Shirly Wang! My name is Piper, and I'm a moderator for the Databricks community. Welcome to the community and thank you for your question! It's nice to meet you. Let's give the members of the community a chance to respond before we return to your question.
Thanks for your patience.
08-25-2022 07:36 AM
This is an issue for me too. I am using a multi-stage job calling the same notebook with different values for the parameters. This feels like a bug to me.
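To make that setup concrete, a minimal sketch of such a job definition (the job name, task keys, notebook path, and cluster ID are hypothetical); the "Run now with different parameters" dialog only accepts one notebook_params map, so both tasks receive the same override:

```python
import json

# Hypothetical multi-task job spec: two tasks run the same notebook with
# different default (base) parameters. This dict has the shape of the JSON
# body for POST /api/2.1/jobs/create (or jobs/reset).
job_spec = {
    "name": "same-notebook-different-params",
    "tasks": [
        {
            "task_key": "load_region_a",
            "notebook_task": {
                "notebook_path": "/Repos/demo/load_data",
                "base_parameters": {"region": "a"},
            },
            "existing_cluster_id": "<cluster-id>",
        },
        {
            "task_key": "load_region_b",
            "notebook_task": {
                "notebook_path": "/Repos/demo/load_data",
                "base_parameters": {"region": "b"},
            },
            "existing_cluster_id": "<cluster-id>",
        },
    ],
}

print(json.dumps(job_spec, indent=2))
```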
12-14-2022 04:32 AM
Hello,
I am also facing the same issue. The problem is described below:
- I have a multi-task job. The job consists of multiple "spark_python_task" tasks, each of which executes a Python script on a Spark cluster. The pipeline is created within a CI/CD process, and every task has its own default parameters.
- When I run this job from the Jobs UI by clicking the "Run now with different parameters" button, I can specify only a single parameter set for "spark_python_task". I would like to be able to pass arguments to each task separately, but the parameters I pass are applied to all tasks.
- Even though it is stated that parameters are merged with the default parameters, I am observing that my parameters are overridden (reference).
Apart from the Jobs UI, I run into the same problem when I trigger the job using the API (reference).
In short, how can I pass different parameters to individual tasks, and have them merged with the defaults instead of overridden, in a multi-task job?
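For reference, a minimal sketch of the kind of API trigger described above (the workspace URL, token, job ID, and argument values are placeholders); the comments note where the override appears to happen:

```python
import requests

# Placeholder values -- substitute your own workspace URL, token, and job ID.
HOST = "https://<your-workspace>.cloud.databricks.com"
TOKEN = "<personal-access-token>"
JOB_ID = 67890

# python_params is a single list for the entire run: it is applied to every
# spark_python_task in the job, and it overwrites the parameters defined on the
# tasks rather than being merged with them -- which would explain the override
# behaviour described above.
resp = requests.post(
    f"{HOST}/api/2.1/jobs/run-now",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "job_id": JOB_ID,
        "python_params": ["--env", "prod", "--run_date", "2022-12-14"],
    },
)
resp.raise_for_status()
print(resp.json()["run_id"])
```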