Hi Databricks, we have created several Databricks workflows, and the `job-definition.json` for each of them is stored in version control, i.e. GitHub. Several parameters in these job definitions are read from `params.json`, but the issue is that `params.json` is hardcoded in the current design.
We are trying to figure out a way to pass config parameters to the Controller workflow without going through a hotfix branch or the Databricks UI, while still keeping some approvals/governance in place. Is there a better way of doing this, so that we don't have to dig into `params.json` to update or modify the existing keys/values?
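To make the question concrete, this is roughly the override behaviour we are after (just a sketch, not our actual `Config.py`; it assumes it runs inside a Databricks notebook, where `dbutils` is injected by the runtime): committed defaults from `params.json`, with job-level `notebook_params` taking precedence for a given run.

```python
# Sketch only, not our actual Config.py. Assumes a Databricks notebook
# context, where `dbutils` is provided by the runtime.
import json

DEFAULT_PARAMS_PATH = "devops/params.json"  # the committed defaults (see repo tree below)

def load_params(path: str = DEFAULT_PARAMS_PATH) -> dict:
    """Load version-controlled defaults, then let job parameters override them."""
    with open(path) as f:
        params = json.load(f)
    for key in params:
        try:
            # Job-level notebook_params surface as widgets; widgets.get
            # raises if no value was passed for this run.
            params[key] = dbutils.widgets.get(key)
        except Exception:
            pass  # no run-time override, keep the committed default
    return params
```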
Here is the basic tree structure of my MLOps repo:
```
.
├── README.md
├── databricks_notebook_jobs
│   ├── CONTROLLER_JOB_CLUSTER
│   │   ├── YOUR_FIRST_NOTEBOOK_NAME.py
│   │   ├── job-definition.json
│   │   └── spark_env_vars
│   │       ├── dev.json
│   │       ├── expl.json
│   │       ├── prod.json
│   │       └── qa.json
│   ├── all_cluster_init.sh
│   ├── default_job_config.json
│   ├── example_module.py
│   └── release.json
├── devops
│   ├── params.json
│   └── utils
│       ├── Config.py
│       ├── Config_old.py
│       └── setenv.py
├── requirements.txt
├── test_requirements.txt
├── tests
│   └── test_cleaning_utils.py
└── version.py
```
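And this is roughly the kind of governed entry point we imagine instead of editing `params.json` directly: an approved pipeline or CI job triggering the Controller with per-run overrides through the Jobs 2.1 `run-now` REST endpoint. Again just a sketch; the host, token, and job id are placeholders, not our real setup.

```python
# Sketch: trigger the Controller job with run-time overrides via the
# Jobs 2.1 run-now endpoint. Host, token, and job id are placeholders.
import os
import requests

host = os.environ["DATABRICKS_HOST"]    # e.g. https://<workspace>.cloud.databricks.com
token = os.environ["DATABRICKS_TOKEN"]  # a PAT issued under our approval process
CONTROLLER_JOB_ID = 123                 # placeholder job id

resp = requests.post(
    f"{host}/api/2.1/jobs/run-now",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "job_id": CONTROLLER_JOB_ID,
        # Keys must match the widgets the notebook reads; values override
        # whatever is committed in devops/params.json, for this run only.
        "notebook_params": {"env": "qa", "run_date": "2024-01-01"},
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json())  # {"run_id": ...}
```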
Please suggest.