I would suggest creating a .databrickscfg profile; it's the best practice and the easiest approach.
You will need to create the file in your home (~) directory, e.g. vim ~/.databrickscfg
Inside the file you can define multiple profiles, like this (see https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/azure-sp):
# For account level access
[<some-unique-configuration-profile-name>]
host = <account-console-url>
account_id = <account-id>
azure_tenant_id = <azure-service-principal-tenant-id>
azure_client_id = <azure-service-principal-application-id>
azure_client_secret = <azure-service-principal-client-secret>
# For workspace level access
[<some-unique-configuration-profile-name>]
host = <workspace-url>
azure_tenant_id = <azure-service-principal-tenant-id>
azure_client_id = <azure-service-principal-application-id>
azure_client_secret = <azure-service-principal-client-secret>
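For illustration, a filled-in file might look like the sketch below. All profile names, IDs, and URLs here are hypothetical placeholders, not real values:

```ini
# Hypothetical account-level profile
[azure-account]
host = https://accounts.azuredatabricks.net
account_id = 00000000-0000-0000-0000-000000000000
azure_tenant_id = 11111111-1111-1111-1111-111111111111
azure_client_id = 22222222-2222-2222-2222-222222222222
azure_client_secret = <your-client-secret>

# Hypothetical workspace-level profile
[azure-workspace]
host = https://adb-1234567890123456.7.azuredatabricks.net
azure_tenant_id = 11111111-1111-1111-1111-111111111111
azure_client_id = 22222222-2222-2222-2222-222222222222
azure_client_secret = <your-client-secret>
```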
After filling in all these values, you can verify that your profiles are set up correctly with the command databricks auth profiles, which lists every profile defined in your .databrickscfg and reports whether its configuration is valid.
You can then run any Databricks CLI command with the global argument -p or --profile <profile-name>, for example:
databricks bundle deploy -p <profile-name>
By default the CLI looks for the .databrickscfg file in ~, but you can keep it in any other directory by pointing the DATABRICKS_CONFIG_FILE environment variable at it. More info on it here: https://docs.databricks.com/aws/en/dev-tools/auth/unified-auth#config-profiles
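As a rough sketch of the custom-location setup (the profile names, hosts, and path below are made up for the example), you point DATABRICKS_CONFIG_FILE at the file; the profile sections are plain INI headers, so you can list them with sed:

```shell
# Create a config file in a non-default location (hypothetical content)
cfg="$(mktemp)"
cat > "$cfg" <<'EOF'
[azure-account]
host = https://accounts.azuredatabricks.net

[azure-workspace]
host = https://adb-1234567890123456.7.azuredatabricks.net
EOF

# Tell the Databricks CLI to use this file instead of ~/.databrickscfg
export DATABRICKS_CONFIG_FILE="$cfg"

# List the profile names defined in the file (the [section] headers)
sed -n 's/^\[\(.*\)\]$/\1/p' "$cfg"
# prints: azure-account and azure-workspace
```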