Hello @azam-io,
From what I know, variables need to be defined in the databricks.yml file (never tried otherwise, to be fair). Since you still want your variables to be environment-dependent, I believe there are a few options.
One could be using dotenv files, or pointing at some other configuration files (for example stored in volumes) where you keep the parameters, and reading the file in your job.
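As a minimal sketch of that second approach (the volume path, file name, and parameter names below are all hypothetical; in a real job the config would already exist, e.g. at something like /Volumes/my_catalog/my_schema/configs/job_x_dev.json, so the temp-file setup here is only to keep the sketch self-contained):

```python
import json
import os
import tempfile

# Stand-in for a per-environment config file that would normally live in a volume.
config_dir = tempfile.mkdtemp()
config_path = os.path.join(config_dir, "job_x_dev.json")
with open(config_path, "w") as f:
    json.dump({"param1": "value1", "param2": "value2"}, f)


def load_job_params(path: str) -> dict:
    """Read a JSON config file and return the job parameters as a dict."""
    with open(path) as f:
        return json.load(f)


params = load_job_params(config_path)
print(params["param1"])  # -> value1
```

The job then only needs to know the path to its config file, which can differ per environment while the code stays identical.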
Or, keeping the structure you envision, define all the variables for all your jobs; maybe you can leverage complex variables:
variables:
  job_x_params:
    description: 'My job params'
    type: complex
    default:
      param1: 'value1'
      param2: 'value2'
      param3:
        param3.1: true
        param3.2: false
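Once defined, the nested fields of the complex variable can be referenced in a job definition with the usual ${var...} interpolation. A rough sketch (job, task, and notebook names here are just placeholders):

```yaml
resources:
  jobs:
    job_x:
      name: job_x
      tasks:
        - task_key: main
          notebook_task:
            notebook_path: ../src/job_x.py
            base_parameters:
              param1: ${var.job_x_params.param1}
              param2: ${var.job_x_params.param2}
```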
Then you can store a variable-overrides.json file for each environment. There's an example of this implementation in this other thread: Solved: Re: How to use variable-overrides.json for environ... - Databricks Community - 125126
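If I remember the convention correctly, the overrides file goes under .databricks/bundle/<target>/variable-overrides.json, and for a complex variable it would look roughly like this (values are made up for illustration):

```json
{
  "job_x_params": {
    "param1": "dev-value1",
    "param2": "dev-value2",
    "param3": {
      "param3.1": false,
      "param3.2": true
    }
  }
}
```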
In my view it can get quite hard to manage if the number of jobs and parameters increases... Storing job parameters in configuration files would probably be cleaner, with the asset bundle variables holding just the paths to those files.
Hope this helps; otherwise, could you maybe share some example parameters?