10-02-2023 07:30 AM
Hi everyone,
It's relatively straightforward to pass a value to a key-value pair in a notebook job. For a Python file job, however, I couldn't figure out how to do it. Does anyone have any idea?
I have tried different variations for a job with a Python file, like the example below. In theory, I think we could add a notebook task at the start of the job flow as a workaround, but it would be great to know a straightforward way to do this.
Thank you for your time
10-02-2023 02:23 PM
You don't need dbutils for this type of parameter. You can get the arguments using sys.argv. This example uses a Python wheel, but you can use it as a reference: https://docs.databricks.com/en/workflows/jobs/how-to/use-python-wheels-in-workflows.html
"""
The entry point of the Python Wheel
"""
import sys
def main():
# This method will print the provided arguments
print('Hello from my func')
print('Got arguments:')
print(sys.argv)
if __name__ == '__main__':
main()
10-04-2023 10:06 AM
ah, thank you, completely forgot about that 🙂
05-08-2024 12:37 AM
Thanks so much for this! By the way, is there a way to do it with the JSON interface? I am struggling to get the parameters if entered in this way 😕