Hi Databricks, we have created several Databricks workflows, and the `job-definition.json` for each of them is stored in version control (GitHub). Several parameters in this job definition are read from `params.json`, but the issue is that `params.json` is hardcoded in the current design.
We are trying to figure out a way to pass config parameters to the Controller workflow, with some approvals/governance in place, without using a hotfix branch or the Databricks UI. Is there a better way of doing this, so that we don't have to dig into `params.json` to update/modify the existing keys/values?
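For context, the controller's job definition points at the params file along these lines (a simplified, hypothetical fragment; the task key and parameter name below are placeholders, not our actual definition):

```json
{
  "name": "CONTROLLER_JOB_CLUSTER",
  "tasks": [
    {
      "task_key": "controller",
      "notebook_task": {
        "notebook_path": "/Repos/mlops/databricks_notebook_jobs/CONTROLLER_JOB_CLUSTER/YOUR_FIRST_NOTEBOOK_NAME",
        "base_parameters": {
          "params_file": "devops/params.json"
        }
      }
    }
  ]
}
```

So any change to a key/value means editing `devops/params.json` in the repo and going through a branch/PR, even for a one-line config change.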
I am adding the basic tree structure for my MLOps repo:
```
.
├── README.md
├── databricks_notebook_jobs
│   ├── CONTROLLER_JOB_CLUSTER
│   │   ├── YOUR_FIRST_NOTEBOOK_NAME.py
│   │   ├── job-definition.json
│   │   └── spark_env_vars
│   │       ├── dev.json
│   │       ├── expl.json
│   │       ├── prod.json
│   │       └── qa.json
│   ├── all_cluster_init.sh
│   ├── default_job_config.json
│   ├── example_module.py
│   └── release.json
├── devops
│   └── params.json
├── requirements.txt
├── utils
│   ├── Config.py
│   ├── Config_old.py
│   └── setenv.py
├── test_requirements.txt
├── tests
│   └── test_cleaning_utils.py
└── version.py
```
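And roughly how those params are consumed at runtime (a minimal sketch; the real `utils/Config.py` is more involved, and the hardcoded path below is exactly the assumption I'm describing):

```python
import json
from pathlib import Path

# Hardcoded location of the params file, relative to the repo root --
# this coupling is what forces a code change for every config update.
REPO_ROOT = Path(__file__).resolve().parents[1]
PARAMS_PATH = REPO_ROOT / "devops" / "params.json"

def load_params() -> dict:
    """Load workflow configuration from the checked-in params.json."""
    with open(PARAMS_PATH) as f:
        return json.load(f)
```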
Please suggest a better approach for this setup.