@ckunal_eng -
A single Databricks Job run cannot dynamically change its "Run As" identity during execution. Instead, you need a pattern that separates the triggering identity from the executing identity.
I would pre-configure the 4 dependent jobs with their respective Service Principals (SPs) and use a master job as the trigger/dispatcher.
The master job's SP needs the CAN_MANAGE_RUN permission on those 4 jobs. The master job can then run a Python notebook that iterates through your dictionary and triggers the relevant job via the Databricks Jobs API.
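A minimal sketch of that dispatcher notebook, using the Jobs API 2.1 `run-now` endpoint directly over HTTPS. The host, token, dataset keys, and job IDs below are placeholders for illustration; in practice the mapping would come from your own dictionary and the token from the master job's SP.

```python
import json
import urllib.request

# Placeholders -- substitute your workspace URL and the master job SP's token.
DATABRICKS_HOST = "https://example.cloud.databricks.com"
TOKEN = "dapi-placeholder"

# Each child job is pre-configured with its own "Run As" Service Principal.
# Hypothetical dataset -> job_id mapping for illustration.
JOB_MAP = {
    "sales": 101,
    "finance": 102,
    "hr": 103,
    "ops": 104,
}

def build_run_now_payload(dataset, params=None):
    """Build the Jobs API 2.1 run-now request body for a dataset key."""
    body = {"job_id": JOB_MAP[dataset]}
    if params:
        body["job_parameters"] = params
    return body

def trigger_job(dataset, params=None):
    """POST /api/2.1/jobs/run-now as the master job's SP; returns the run_id response."""
    req = urllib.request.Request(
        f"{DATABRICKS_HOST}/api/2.1/jobs/run-now",
        data=json.dumps(build_run_now_payload(dataset, params)).encode(),
        headers={
            "Authorization": f"Bearer {TOKEN}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Usage inside the master job's notebook (network call, not executed here):
# for dataset in ["sales", "finance"]:
#     trigger_job(dataset, {"run_date": "2024-01-01"})
```

Because each child job carries its own SP as "Run As", the master job only needs CAN_MANAGE_RUN on the children; the executing identity is resolved per job, not per request.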
RG #Driving Business Outcomes with Data Intelligence