Hello @Naren1 ,
Yes — you can pass parameters from ADF to a Databricks Job run, but you generally can’t use those parameters to change the job cluster configuration (node type, Spark version, autoscale, init scripts, etc.) for that run.
In an ADF Databricks Job activity, the only per-run customization supported is jobParameters (key-value pairs) that are passed into the job run. Doc: https://learn.microsoft.com/en-us/azure/data-factory/transform-data-databricks-job#databricks-job-ac.... The Job activity runs an existing Databricks jobId, optionally with jobParameters; the cluster configuration comes from the job definition in Databricks itself.
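For illustration, a minimal sketch of what the activity's typeProperties could look like in the pipeline JSON. This is an assumption-laden example, not copied from your pipeline: the jobId, linked service name, and parameter keys/values are all placeholders, and you should confirm the exact property names against the linked doc:

```json
{
  "name": "RunDatabricksJob",
  "type": "DatabricksJob",
  "linkedServiceName": {
    "referenceName": "AzureDatabricksLinkedService",
    "type": "LinkedServiceReference"
  },
  "typeProperties": {
    "jobId": "1234",
    "jobParameters": {
      "env": "dev",
      "run_date": "@{formatDateTime(utcNow(), 'yyyy-MM-dd')}"
    }
  }
}
```

Note that jobParameters only changes what the job receives at run time; the node type, Spark version, and similar cluster settings stay as defined on the job.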
Could you please help me understand what you mean by "environment" -- different libraries? A different Spark version? A different node size?
Anudeep