Data Engineering

use job parameters in scripts

N_M
Contributor

Hi Community

I've done some research, but without luck, and I'm a bit surprised I can't find anything about this.

So, I simply want to access the job parameters from Python scripts (not notebooks).

My flow doesn't use notebooks, but I still need to pass in some parameters that I want to declare before running the job (so, static).

Here are my attempts so far:

  • the trivial use of widgets does not work: widgets are available only in notebooks, so dbutils.widgets.text/get are out of scope (the functions simply return None)
  • I tried to move to environment variables set at runtime: a simple notebook, set as the root of the flow, pushes the job parameter into an environment variable, i.e., os.environ["PARAM"] = dbutils.widgets.get("PARAM").
    Unfortunately, env variables are *not* propagated to child tasks (probably because the interpreter is restarted for each task).

Limits to workarounds:

Of course there are countless workarounds, but some are not applicable to my scope and some are really bad practice. Here are my limitations:

  • setting environment variables in init scripts does NOT solve my problem. I want the parameter to change before each run of the job, on the same cluster, with the same flow, etc.
  • I want to avoid creating as many job clusters as there are parameter values and "picking" the relevant cluster every time I change parameters. Not a good practice in my view.
  • I have 20+ scripts running as a DAG flow. They are scripts because they are also supposed to run outside Databricks and independently, so I want to avoid converting them to notebooks (this also brings some issues with versioning the code AND the VS Code Databricks plugin... another topic).
  • I cannot use task values. Task values depend on the task that sets them, and it doesn't make sense to load a parameter in all my scripts from a hardcoded task (something like dbutils.jobs.taskValues.get(taskKey="environment_setter", key="param", default=42, debugValue=0), where "environment_setter" is my root task...)

Any idea is really appreciated!

Thanks

1 ACCEPTED SOLUTION

N_M
Contributor

The only working workaround I found was provided in another thread:
Re: Retrieve job-level parameters in Python - Databricks Community - 44720

I will repost it here (thanks @julio_resende).

You need to push your parameters down to the task level. E.g.:

  1. Create a job-level parameter called "my_param"
  2. Reference this job parameter in the task-level parameters box, e.g.:
    ["--my_param","{{job.parameters.my_param}}"]
  3. Read the task-level parameter with argparse in your .py file (see the sketch below)
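
For step 3, here is a minimal sketch of what the .py file could look like (the parameter name my_param matches the example above; the default value is just illustrative):

import argparse

# The task-level parameters ["--my_param", "{{job.parameters.my_param}}"] arrive as
# ordinary command-line arguments, so argparse can pick them up.
parser = argparse.ArgumentParser()
parser.add_argument("--my_param", default="not_set")  # illustrative default
args = parser.parse_args()

print(f"my_param = {args.my_param}")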


7 REPLIES

daniel_sahal
Esteemed Contributor

@N_M 
From what I see in the documentation, spark_python_task takes "parameters" as an array of strings, into which you can put your command-line parameters using {{job.parameters.[name]}}

This is the right answer; here is the doc Daniel is referring to: https://docs.databricks.com/en/workflows/jobs/parameter-value-references.html#pass-context-about-job...
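
For illustration, the task definition could look roughly like this in a Jobs API payload (sketched here as a Python dict; the task key and file path are placeholders, not something from this thread):

task = {
    "task_key": "run_my_script",  # placeholder name
    "spark_python_task": {
        "python_file": "/Workspace/path/to/my_script.py",  # placeholder path
        # Resolved at run time from the job-level parameter "my_param"
        "parameters": ["--my_param", "{{job.parameters.my_param}}"],
    },
}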

xiangzhu
Contributor III

A workaround that I found is to use the Databricks Jobs API to get the job run info. The job parameters are inside, but you need to prepare credentials in advance.
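
A rough sketch of that approach, assuming the script somehow knows its own run id and a token is available (the environment variable names and the job_parameters field of the runs/get response are assumptions here, not confirmed in this thread):

import os
import requests

host = os.environ["DATABRICKS_HOST"]    # workspace URL, assumed to be set
token = os.environ["DATABRICKS_TOKEN"]  # PAT prepared in advance, assumed to be set
run_id = os.environ["MY_RUN_ID"]        # hypothetical: e.g. passed to the task as "{{job.run_id}}"

resp = requests.get(
    f"{host}/api/2.1/jobs/runs/get",
    headers={"Authorization": f"Bearer {token}"},
    params={"run_id": run_id},
)
resp.raise_for_status()

# Collect the job-level parameters from the run object (field name assumed)
job_params = {p["name"]: p.get("value", p.get("default"))
              for p in resp.json().get("job_parameters", [])}
print(job_params)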

jensi
New Contributor II

@N_M, I have the same issue.

Have you found a solution to the problem?

xiangzhu
Contributor III

There are 2 solutions:

The official one: Re: Retrieve job-level parameters in Python - Databricks Community - 44720

The other one is to use the Jobs REST API.

The official one is only available once you reach the argparse part (for some use cases, that might be too late). The REST API, on the other hand, is reachable from anywhere.

jensi
New Contributor II

Thank you! It worked! I used the first (official) solution with asset bundles.

