06-21-2024 02:04 AM
Hi Community
I did some research but wasn't lucky, and I'm a bit surprised I can't find anything about it.
Simply put, I would like to access the job parameters when using Python scripts (not notebooks).
My flow doesn't use notebooks, but I still need to drive some parameters that I want to declare before I run the job (so, static).
Here are my attempts so far:
Of course there are 10K workarounds, but some are not applicable to my scope and some are really bad practice. Let me list my limitations here:
Thanks
08-07-2024 08:57 AM
The only working workaround I found was provided in another thread:
Re: Retrieve job-level parameters in Python - Databricks Community - 44720
I will repost it here (thanks @julio_resende).
You need to push your parameters down to the task level. E.g.:
06-21-2024 04:54 AM
@N_M
From what I see in the documentation, spark_python_task takes "parameters" as an array of strings, in which you can put your command-line parameters using {{job.parameters.[name]}}.
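As a sketch, a spark_python_task definition forwarding a job parameter this way might look like the following (expressed here as the Python dict you would send to the Jobs API; the file path, task key, and parameter name `env` are assumptions):

```python
# Sketch of a Jobs API task spec that forwards the job-level parameter
# "env" to a Python script via spark_python_task. Path, task key, and
# parameter name are hypothetical.
task_spec = {
    "task_key": "my_python_task",
    "spark_python_task": {
        "python_file": "/Workspace/scripts/main.py",
        # Each entry becomes one command-line argument; the {{...}}
        # reference is resolved to the job parameter's value at run time.
        "parameters": ["--env", "{{job.parameters.env}}"],
    },
}

print(task_spec["spark_python_task"]["parameters"])
```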
07-30-2024 07:28 AM
This is the right answer, here is the doc Daniel is referring to: https://docs.databricks.com/en/workflows/jobs/parameter-value-references.html#pass-context-about-job...
06-23-2024 11:21 AM
A workaround that I found is to use the Databricks Jobs API to get the job run info. The job parameters are inside, but you need to prepare credentials in advance.
08-07-2024 01:18 AM
@N_M, I have the same issue.
Have you found a solution to the problem?
08-07-2024 01:35 AM
There are two solutions:
official one: Re: Retrieve job-level parameters in Python - Databricks Community - 44720
another one is to use the Jobs REST API.
The official one is only available once you enter the argparse part (for some use cases, it might be too late). On the other hand, the REST API is reachable from anywhere.
08-07-2024 07:56 AM
Thank you! It worked! Used the first (official) solution with asset bundles.