09-14-2023 01:44 AM
Parameters can be passed to Tasks and the values can be retrieved with:
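Later replies in this thread confirm that task parameter values surface as notebook widgets. A minimal sketch, assuming a Databricks notebook where `dbutils` is in scope (the `get_task_param` wrapper and its fallback behavior are my own additions, not a Databricks API):

```python
def get_task_param(name, default=None):
    # Hypothetical defensive wrapper: inside a Databricks notebook, task
    # parameter values are exposed as widgets and read with
    # dbutils.widgets.get(name); outside Databricks (or if the parameter
    # is missing) this falls back to a default instead of raising.
    try:
        return dbutils.widgets.get(name)  # dbutils exists only on Databricks
    except Exception:
        return default
```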
More recently, we have been given the ability to add parameters to Jobs.
However, the parameters cannot be retrieved like Task parameters.
Question: How can we retrieve job-level parameters in notebook code?
04-25-2024 11:46 AM
@Retired_mod This method works for Task parameters. Is there a way to access Job parameters that apply to the entire workflow, set under a heading like this in the UI:
I am able to read Job parameters in a different way from Task parameters using dynamic value references:
{{tasks.[task_name].values.[value_name]}}
vs.
{{job.parameters.[name]}}
This works for reading Parameters in the Workflow itself, such as:
Is there an analogous way to read a Job parameter within a Notebook? The Note on this page seems to indicate that these dynamic value references are available in Notebooks, but how do you reference them in Python?
04-25-2024 12:10 PM
A coworker has answered this question for me, posting it for anyone else looking for an answer:
import json

run_parameters = dbutils.notebook.entry_point.getCurrentBindings()  # parameter bindings for this run
context = json.loads(dbutils.notebook.entry_point.getDbutils().notebook().getContext().toJson())  # run metadata as a dict
job_param = run_parameters[key]  # key is the parameter's name
05-03-2024 07:49 AM
An update to my answer: Databricks has advised us that the `dbutils.notebook.entry_point` method is not supported (and could be deprecated); the recommended way to read a job parameter is through widgets, i.e. `dbutils.widgets.get("param_key")` (the same as Task parameters; if a Task param and a Job param share the same name, the Job param value takes precedence).
If you want to read a dynamic value reference such as the run_id inside a Notebook, you can set the reference (e.g. `{{job.run_id}}`) as a Task or Job param, and read it like a widget.
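As a sketch of what that pattern looks like in a job definition (the shape is simplified and the parameter name `current_run_id` is hypothetical):

```python
# Simplified fragment of a job definition (names are hypothetical):
job_settings = {
    "parameters": [
        {"name": "current_run_id", "default": "{{job.run_id}}"},
    ],
}
# At run time the reference is resolved, and inside the notebook the value
# can then be read like any other widget:
# run_id = dbutils.widgets.get("current_run_id")
```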
06-21-2024 03:08 AM
I think `dbutils.widgets.get` is for notebooks only, not for Python jobs as asked in this thread.
06-21-2024 03:31 AM
Ah sorry, the thread asked about notebooks too.
Nevertheless, I'm searching for a way to get job params in pure Python jobs.
07-12-2024 01:51 PM
You need to push down your parameters to a task level. Eg:
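A minimal sketch of that push-down, using the dynamic value reference syntax mentioned earlier in this thread; the parameter name `env` is hypothetical:

```python
# Hypothetical fragment of a task definition: forward the job-level
# parameter "env" down to the task via a dynamic value reference.
task_settings = {
    "base_parameters": {
        "env": "{{job.parameters.env}}",
    },
}
# Inside the notebook the forwarded value is then read as a widget:
# env = dbutils.widgets.get("env")
```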
07-12-2024 02:36 PM
Thanks!
2 weeks ago
How do I read the task-level parameter using argparse?
2 weeks ago
The task-level parameter is not very useful, in my opinion, because it is hardcoded rather than a real parameter. In such cases, I often use a config.py file to define all task-level parameters directly within Python as a configuration.
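A minimal sketch of that config.py pattern; every name and value here is a placeholder of my own, not from the thread:

```python
# config.py -- static task-level settings kept in code rather than
# hardcoded task parameters (all values are hypothetical placeholders).
INPUT_PATH = "/mnt/raw/events"
BATCH_SIZE = 500

# Task code then imports the settings:
# from config import INPUT_PATH, BATCH_SIZE
```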
However, job-level parameters are really useful since we can change their values dynamically when manually triggering a job run.
I'm not aware of a built-in dbutils method that directly returns the current job parameters, but we can work around this by querying the Jobs API.
At the very beginning of the task's Python code:
From anywhere in your code, you can now access the current job parameters from the environment variables without using argparse.
For steps 1โ4, you could write a function to encapsulate the entire process.
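A sketch of the workaround described above, assuming the Jobs API 2.1 `runs/get` endpoint and that the workspace URL, a token, and the current run_id (which can itself be passed in as `{{job.run_id}}`) are available; the function names are my own:

```python
import json
import os
import urllib.request


def fetch_run(host, token, run_id):
    # Steps 1-2: query the Jobs API for the current run and parse the JSON.
    url = f"{host}/api/2.1/jobs/runs/get?run_id={run_id}"
    req = urllib.request.Request(url, headers={"Authorization": f"Bearer {token}"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


def export_job_params(run_response):
    # Steps 3-4: copy each job parameter from the response into an
    # environment variable so later code can read it via os.environ.
    for p in run_response.get("job_parameters", []):
        os.environ[p["name"]] = str(p.get("value", p.get("default", "")))
```

After calling both at the start of the task, `os.environ["some_param"]` works from anywhere in the code without argparse.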
08-06-2024 02:01 PM
The only thing that has worked for me consistently in Python is