06-21-2024 02:04 AM
Hi Community
I did some research but wasn't lucky, and I'm a bit surprised I can't find anything about it.
I simply want to access job parameters from Python scripts (not notebooks).
My flow doesn't use notebooks, but I still need to drive some parameters that I want to declare before I run the job (so, static).
Here are my attempts so far:
- the trivial use of widgets does not work. Widgets are available only in notebooks, so dbutils.widgets.text/get are out of scope; the functions simply return None
- I tried to move to environment variables set at runtime: a simple notebook, set as the root of the flow, pushes the job parameter into an environment variable, i.e., os.environ["PARAM"] = dbutils.widgets.get("PARAM").
Unfortunately, env variables are *not* propagated to child tasks (probably because interpreters are restarted for each task)
Limits to workarounds:
Of course there are 10K workarounds, but some are not applicable to my scope and some are really bad practice. My limitations:
- setting environment variables in init scripts does NOT solve my problem. I want a parameter to change before running the job, so same cluster, same flow, etc.
- I want to avoid creating one job cluster per parameter value and "picking" the relevant cluster every time I change parameters. Not good practice in my view
- I have 20+ scripts running as a DAG flow. They are scripts because they are supposed to run outside Databricks as well, and independently, so I want to avoid converting them to notebooks (this also raises issues with versioning the code AND the VS Code Databricks plugin... other topic)
- I cannot use task values. Task values depend on the task that set them, and it doesn't make sense to load a parameter in all my scripts with a hardcoded task key (something like dbutils.jobs.taskValues.get(taskKey = "environment_setter", key = "param", default = 42, debugValue = 0), where "environment_setter" is my root task...)
Any idea is really appreciated!
Thanks
Accepted Solutions
08-07-2024 08:57 AM
The only working workaround I found was provided in another thread:
Re: Retrieve job-level parameters in Python - Databricks Community - 44720
I will repost it here (thanks @julio_resende ):
You need to push your parameters down to the task level, e.g.:
- Create a job-level parameter called "my_param"
- Reference this job parameter in the task-level parameters box, e.g. ["--my_param","{{job.parameters.my_param}}"]
- Read the task-level parameter using argparse in your .py file (a sketch follows this list)
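For illustration, a minimal sketch of what the .py task could look like, assuming the task-level parameters are set to ["--my_param","{{job.parameters.my_param}}"] as above (the script name and default value are placeholders):

# my_task.py - minimal sketch; the flag name must match the task-level parameters box
import argparse

parser = argparse.ArgumentParser()
parser.add_argument("--my_param", default="42")  # placeholder default
# parse_known_args() tolerates any extra arguments the runtime may append
args, _ = parser.parse_known_args()
print(f"my_param = {args.my_param}")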
06-21-2024 04:54 AM
@N_M
From what I see in the documentation, spark_python_task takes a "parameters" field as an array of strings, in which you can pass your command-line parameters using {{job.parameters.[name]}}
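For illustration, a hedged sketch of such a task definition expressed as a Jobs 2.1 API payload fragment in Python (the script path and the parameter name "my_param" are assumptions):

# Fragment of a jobs/create (Jobs API 2.1) payload; python_file path and
# the job parameter name "my_param" are assumptions for illustration.
task = {
    "task_key": "run_script",
    "spark_python_task": {
        "python_file": "dbfs:/scripts/my_task.py",
        # resolved at run time to the job-level parameter value and passed
        # to the script as command-line arguments
        "parameters": ["--my_param", "{{job.parameters.my_param}}"],
    },
}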
07-30-2024 07:28 AM
This is the right answer; here is the doc Daniel is referring to: https://docs.databricks.com/en/workflows/jobs/parameter-value-references.html#pass-context-about-job...
06-23-2024 11:21 AM
A workaround that I found is to use the Databricks Jobs API to get the job run info. The job parameters are inside, but you need to prepare credentials in advance.
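A rough sketch of that approach, assuming the workspace host, a token and the run ID are somehow made available to the script (all three environment variable names below are placeholders):

# Sketch of fetching job parameters via the Jobs 2.1 REST API.
# DATABRICKS_HOST, DATABRICKS_TOKEN and RUN_ID are assumed to be provided
# to the script (e.g. RUN_ID passed as a task parameter "{{job.run_id}}").
import os
import requests

resp = requests.get(
    f"{os.environ['DATABRICKS_HOST']}/api/2.1/jobs/runs/get",
    headers={"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"},
    params={"run_id": os.environ["RUN_ID"]},
)
resp.raise_for_status()
run = resp.json()
# job_parameters is included for jobs that define job-level parameters
params = {p["name"]: p.get("value", p.get("default")) for p in run.get("job_parameters", [])}
print(params)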
08-07-2024 01:18 AM
@N_M, I have the same issue.
Have you found a solution to the problem?
08-07-2024 01:35 AM
There are two solutions:
the official one: Re: Retrieve job-level parameters in Python - Databricks Community - 44720
the other is to use the Jobs REST API.
The official one is only available once you reach the argparse part (for some use cases, that might be too late). On the other hand, the REST API is reachable from anywhere.
08-07-2024 07:56 AM
Thank you! It worked! I used the first (official) solution with asset bundles.