06-21-2024 03:43 AM
Hello,
I would like to use job parameters in a spark_python_task (not a notebook_task). Does anyone know how to retrieve these parameters inside pure Python?
I tried:
1/ dbutils.widgets.get("debug"), which raised the error:
com.databricks.dbutils_v1.InputWidgetNotDefined: No input widget named debug is defined
2/ using argparse to define a debug param, but after testing, debug is not passed as a param:
parser: (Namespace(debug=None), [])
My job does have a debug parameter defined at the job level.
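Roughly what I tried with argparse (simplified; the script and parameter names are placeholders, only `debug` comes from my job configuration):

```python
# my_task.py -- entry point of the spark_python_task
import argparse
import sys

parser = argparse.ArgumentParser()
parser.add_argument("--debug", default=None)

# parse_known_args so any unrelated arguments don't raise an error
args, unknown = parser.parse_known_args()

# Prints (Namespace(debug=None), []) -- the job-level parameter
# never shows up in the script's command line
print((args, unknown))
print(sys.argv)
```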
06-21-2024 04:54 AM
@xiangzhu It looks like it's the same as in https://community.databricks.com/t5/data-engineering/use-job-parameters-in-scripts/td-p/75296
06-23-2024 11:12 AM
Thanks for your reply. Yes, I have the same question as the thread you mentioned, but argparse doesn't work for job-level parameters; it only works for task-level parameters.
The Databricks documentation is minimal on this point and only says that job-level parameters will be propagated to tasks, nothing more.
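To illustrate the distinction, a sketch with illustrative names (not my actual job definition): task-level `parameters` on a spark_python_task are forwarded to the script's argv, which is why argparse can see them, while a job-level parameter defined only at the job level never reaches argv on its own.

```python
# Illustrative spark_python_task definition, written as a Python dict
# (e.g. as you would pass it to the Jobs API or SDK); names are examples.
task = {
    "task_key": "my_task",
    "spark_python_task": {
        "python_file": "dbfs:/scripts/my_task.py",
        # Task-level parameters end up in the script's sys.argv:
        #   python my_task.py --debug true
        "parameters": ["--debug", "true"],
    },
}
```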
06-23-2024 11:18 AM
Over the weekend, I found a workaround by leveraging the Databricks Jobs API.
It works, but I must provide credentials, whereas I'm looking for a solution without credentials since I'm already inside a job run.
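Something along these lines (a simplified sketch, not my exact script: the host/token/run id handling is illustrative, the credentials are exactly the part I'd like to avoid, and the shape of the response payload may differ in your workspace):

```python
# Sketch of the Jobs API workaround: fetch the run and read its job parameters.
import os

import requests

host = os.environ["DATABRICKS_HOST"]    # e.g. https://<workspace>.cloud.databricks.com
token = os.environ["DATABRICKS_TOKEN"]  # the credential requirement of this workaround
run_id = os.environ["RUN_ID"]           # illustrative: however the run id reaches the script

resp = requests.get(
    f"{host}/api/2.1/jobs/runs/get",
    headers={"Authorization": f"Bearer {token}"},
    params={"run_id": run_id},
    timeout=30,
)
resp.raise_for_status()
run = resp.json()

# Job-level parameters appear in the run payload; inspect the JSON in your
# workspace, as the exact field layout may vary.
job_params = {
    p["name"]: p.get("value", p.get("default"))
    for p in run.get("job_parameters", [])
}
print(job_params.get("debug"))
```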
06-23-2024 11:24 AM
I rechecked your answer in the other thread, and sorry, I hadn't seen `{{job.parameters.[name]}}`; I will try it in the coming days.
06-24-2024 07:08 AM
@daniel_sahal I just tested `{{job.parameters.[name]}}`; it works, thanks again!
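For future readers, a sketch of the setup that works (the task definition and all names below are illustrative): the job-level parameter is forwarded to the spark_python_task through the `{{job.parameters.[name]}}` dynamic value reference, and the script then reads it like any other command-line argument.

```python
# Illustrative task definition: the job-level "debug" parameter is forwarded
# to the script via a dynamic value reference.
#
#   "spark_python_task": {
#       "python_file": "dbfs:/scripts/my_task.py",
#       "parameters": ["--debug", "{{job.parameters.debug}}"]
#   }
#
# my_task.py then picks it up with argparse:
import argparse

parser = argparse.ArgumentParser()
parser.add_argument("--debug", default="false")
args, _ = parser.parse_known_args()

print(f"debug = {args.debug}")  # the job-level value, resolved at run time
```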