You're welcome. Databricks released a feature that links the workflow definition to Git automatically. Please refer to the link below:
https://www.databricks.com/blog/2022/06/21/build-reliable-production-data-and-ml-pipelines-with-git-...
In terms of the flow, below are the steps you should follow.
Step 1: Link the workflow with the Git repository using the Databricks feature. If that feature is not yet available to you, developers can run a very simple Python or shell script locally (it only needs to make one API call) based on this documentation: https://docs.databricks.com/workflows/jobs/jobs-2.0-api.html (this is the method we follow, since we implemented Git integration even before Databricks added the feature). A minimal sketch of such a script is included after the steps.
Step 2: The developer checks in the code, and a peer approves the PR.
Step 3: The DevOps pipeline can pick the workflow config from the Git folder and then either 1) perform a string replace in the Azure DevOps pipeline to substitute environment-specific values and deploy the Databricks workflow to your higher environment using a task or an API call (see the second sketch below), or 2) use a Terraform task to deploy the workflow, which may be a little easier than option 1.
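For Step 1, here is a minimal sketch of the kind of local script I mean: one call to the Jobs 2.0 API to pull a workflow definition down as JSON so it can be committed to Git. The environment variable names, the job ID argument, and the output path are assumptions; adjust them to your own workspace and repo layout.

```python
# Minimal sketch: export a job (workflow) definition as JSON so it can be
# committed to Git. DATABRICKS_HOST, DATABRICKS_TOKEN, the job ID argument,
# and the output path are placeholders -- adapt them to your setup.
import json
import os
import sys

import requests

host = os.environ["DATABRICKS_HOST"]    # workspace URL, e.g. https://adb-xxxx.azuredatabricks.net
token = os.environ["DATABRICKS_TOKEN"]  # personal access token
job_id = sys.argv[1]                    # job ID passed on the command line

# Single call to the Jobs 2.0 API to fetch the job definition
resp = requests.get(
    f"{host}/api/2.0/jobs/get",
    headers={"Authorization": f"Bearer {token}"},
    params={"job_id": job_id},
)
resp.raise_for_status()

# Keep only the "settings" block -- that is what gets re-deployed later
settings = resp.json()["settings"]
with open(f"workflows/job_{job_id}.json", "w") as f:
    json.dump(settings, f, indent=2)
```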
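And for option 1 of Step 3, a sketch of the deploy side: read the workflow config from the Git folder, string-replace environment-specific placeholders, and create the job in the higher environment with one API call. The placeholder token "__ENV__", the file path, and the variable names are assumptions; use whatever convention your repo and pipeline follow.

```python
# Minimal sketch of option 1: read the workflow config from Git, replace
# environment-specific placeholders, and deploy it with one API call.
# "__ENV__", the file path, and the env var names are assumptions.
import json
import os

import requests

host = os.environ["DATABRICKS_HOST"]    # target (higher) environment workspace URL
token = os.environ["DATABRICKS_TOKEN"]
target_env = os.environ.get("TARGET_ENV", "prod")

with open("workflows/job_12345.json") as f:
    raw = f.read()

# Same idea as the string-replace task in Azure DevOps, just done in Python
settings = json.loads(raw.replace("__ENV__", target_env))

# Create the job in the higher environment. If the job already exists and you
# want to update it in place, call /api/2.0/jobs/reset with its job_id instead.
resp = requests.post(
    f"{host}/api/2.0/jobs/create",
    headers={"Authorization": f"Bearer {token}"},
    json=settings,
)
resp.raise_for_status()
print("Created job:", resp.json()["job_id"])
```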
Hope that helps.