Hi @seefoods,
In your asset bundle YAML, define the parameters with the named_parameters field of the python_wheel_task, for example:
tasks:
  - task_key: python_task
    python_wheel_task:
      package_name: my_package   # placeholder: the package that contains your entry point
      entry_point: main
      named_parameters:
        input_path: "/data/input"
        output_table: "results"
        debug_mode: "true"
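For context, that tasks block sits under a job resource in databricks.yml (or a file under resources/). The sketch below shows roughly where it goes; the job name, package name, and wheel path are placeholders, and the libraries entry assumes you build the wheel as a bundle artifact:

resources:
  jobs:
    my_wheel_job:                        # placeholder job resource name
      name: my_wheel_job
      tasks:
        - task_key: python_task
          python_wheel_task:
            package_name: my_package     # placeholder: your wheel's package name
            entry_point: main
            named_parameters:
              input_path: "/data/input"
              output_table: "results"
              debug_mode: "true"
          libraries:
            - whl: ./dist/*.whl          # adjust to wherever your wheel artifact is built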
Then, in your wheel's entry point, use argparse to read those parameters:
import argparse

def main():
    parser = argparse.ArgumentParser()
    parser.add_argument("--input_path", required=True)
    parser.add_argument("--output_table", required=True)
    parser.add_argument("--debug_mode", default="false")
    args = parser.parse_args()
    # use args.input_path, args.output_table, args.debug_mode here

if __name__ == "__main__":
    main()
When the job runs, Databricks passes each named parameter to your entry point as a --key=value command-line argument, and argparse picks them up in your code.
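If you want to sanity-check the parsing locally before deploying, you can feed parse_args the same --key=value strings the job will supply. This is just a sketch, and the values are the ones from the example above:

import argparse

# Simulate the --key=value arguments Databricks supplies for named_parameters.
parser = argparse.ArgumentParser()
parser.add_argument("--input_path", required=True)
parser.add_argument("--output_table", required=True)
parser.add_argument("--debug_mode", default="false")

args = parser.parse_args([
    "--input_path=/data/input",
    "--output_table=results",
    "--debug_mode=true",
])

# Named parameters arrive as strings, so convert flags like debug_mode yourself.
debug = args.debug_mode.lower() == "true"
print(args.input_path, args.output_table, debug)

One thing to keep in mind: every named parameter comes through as a string, so booleans and numbers need explicit conversion in your code.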