2 weeks ago
Hi community,
How can I overwrite a job parameter inside a job task? It seems that the job parameter has a higher priority than a task parameter, even when the task parameter is supposed to override it.
2 weeks ago
Hi @jeremy98 ,
It's exactly as you wrote. Job parameters take precedence over task parameters. If a job parameter and a task parameter have the same key, the job parameter overrides the task parameter and you can't change that behavior.
But there is a workaround: define the task parameter with a different key, and then handle it with some code in the notebook. For instance, the code can check whether both a job parameter and a task parameter were provided, and if so, take the task parameter.
https://docs.databricks.com/aws/en/jobs/job-parameters#job-parameter-pushdown
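For illustration, the precedence check in the notebook could look like the sketch below. The parameter names are hypothetical; in a real notebook you would read both values with dbutils.widgets.get(...), which only exists inside Databricks, so the lookup is factored out here as plain Python:

```python
# Sketch of the workaround: give the task parameter a different key than the
# job parameter, then let the notebook decide which one wins.
# In a Databricks notebook you would read both with dbutils.widgets.get(...);
# here the resolution logic is isolated so it runs anywhere.

def resolve_parameter(job_value, task_override):
    """Return the task-level override when it was provided, else the job value."""
    if task_override is not None and task_override != "":
        return task_override
    return job_value

# Example: job parameter "key_parameter" is "stg", but this one task also
# received a hypothetical "key_parameter_override" with a different value.
print(resolve_parameter("stg", "another_value"))  # -> another_value
print(resolve_parameter("stg", ""))               # -> stg
```

This keeps the job parameter as the shared default while letting any single task opt out by supplying the override key.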
2 weeks ago
Hey @jeremy98,
This might be of some use:
https://docs.databricks.com/aws/en/jobs/run-now#with-different-params
You can start a job with different settings/job parameters, but continuous jobs will always keep the default of a single concurrent run.
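As a sketch, triggering a run with overridden job parameters via the Jobs 2.1 run-now endpoint could look like this. The host, token, and job ID are placeholders, and the actual HTTP call is commented out so the snippet stays self-contained:

```python
import json
import urllib.request

# Placeholders -- substitute your own workspace URL, PAT, and job ID.
HOST = "https://<workspace-host>"
TOKEN = "<personal-access-token>"

def build_run_now_payload(job_id, job_parameters):
    """Build the body for POST /api/2.1/jobs/run-now with overridden job parameters."""
    return {"job_id": job_id, "job_parameters": job_parameters}

payload = build_run_now_payload(123, {"key_parameter": "another_value"})
print(json.dumps(payload))

# Sending it would look roughly like:
# req = urllib.request.Request(
#     f"{HOST}/api/2.1/jobs/run-now",
#     data=json.dumps(payload).encode(),
#     headers={"Authorization": f"Bearer {TOKEN}", "Content-Type": "application/json"},
# )
# urllib.request.urlopen(req)
```

The overridden values apply only to that triggered run; the job's stored defaults are unchanged.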
2 weeks ago
Hello @jeremy98
I'm not certain I've understood your question, but here are my thoughts on job parameters:
Job parameters are global, meaning they are available in every task of the job. Task parameters are not global and are only visible within their own task. If you have a task parameter with the same name as a job parameter, the job parameter will take priority. The task parameter will not be able to overwrite the job parameter.
I hope this helps.
Regards - Pilsner
2 weeks ago
Hi,
Thanks for your answer, but what I mean is: I have 4 ingestion tasks that take a key parameter from the job-parameter level. Now I want 3 of those tasks to keep taking that shared key parameter, while the last task overrides it with a new value.
Something like this:
jobs:
  job1:
    name: job1
    run_as:
      service_principal_name: ${var.SP_DATABRICKS_ID_STG}
    parameters:
      - name: key_parameter
        default: stg
    tasks:
      - task_key: task_1
        notebook_task:
          ...
          base_parameters:
            key_parameter: "{{job.parameters.key_parameter}}"
      - task_key: task_2
        notebook_task:
          ...
          base_parameters:
            key_parameter: "another_value"
2 weeks ago - last edited 2 weeks ago
Hello @jeremy98,
Thank you for providing this, I think I'm with you now.
If you want to pass a value from one task to another (similarly to how job parameters are passed to all tasks), you can define task values.
https://docs.databricks.com/aws/en/jobs/task-values
Using these you should be able to dynamically calculate the value in one task, then call upon it, inside a subsequent task.
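In a notebook, a task value is written with dbutils.jobs.taskValues.set(...) and read downstream with dbutils.jobs.taskValues.get(...). Since dbutils only exists inside Databricks, here is a minimal sketch that mimics that flow with an in-memory stand-in (all names are illustrative):

```python
# Illustrative stand-in for dbutils.jobs.taskValues, which is only available
# inside a Databricks notebook. For simplicity, values are keyed by the value
# name only, ignoring which task set them.
class TaskValuesStub:
    def __init__(self):
        self._store = {}

    def set(self, key, value):
        # In a real notebook: dbutils.jobs.taskValues.set(key="...", value=...)
        self._store[key] = value

    def get(self, taskKey, key, default=None):
        # In a real notebook: dbutils.jobs.taskValues.get(taskKey="...", key="...", default=...)
        return self._store.get(key, default)

task_values = TaskValuesStub()

# An upstream task computes the value dynamically...
task_values.set("key_parameter", "another_value")

# ...and a downstream task reads it, falling back to a default.
print(task_values.get(taskKey="task_1", key="key_parameter", default="stg"))  # -> another_value
```

The default argument is what keeps the other tasks working when no upstream override was set.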
Hopefully, this is more useful than my original answer.
Regards - Pilsner
2 weeks ago
Hi syz:),
Yes, haha, it's weird in my opinion, but your solution works fine!
2 weeks ago
Hi Pilsner,
Thanks for your response. The issue is that I need to know the value beforehand: task values can only be set from code inside a notebook, whereas I want to set them at the task definition level, which I don't think Databricks supports.