I would recommend that the 'user' your Databricks Jobs are triggered by be a dedicated user. This is what I would consider a 'Service Account'; I'll drop a definition for that type of user below.
Since you have SSO enabled, I would create this user in your IdP (Google Workspace) and propagate the newly created user into the Databricks workspace. I would then ensure this user has the appropriate Job Create permissions and generate a PAT for the Airflow integration.
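As a rough sketch of how that service-account PAT gets used, here is what a call to the Databricks Jobs `run-now` endpoint looks like. The host, token, and job id below are placeholders for illustration; the request is built but not sent:

```python
import json
import urllib.request

def build_run_now_request(host: str, token: str, job_id: int) -> urllib.request.Request:
    """Build (but do not send) a Databricks Jobs 'run now' request,
    authenticated with the service account's PAT as a Bearer token."""
    url = f"https://{host}/api/2.1/jobs/run-now"
    body = json.dumps({"job_id": job_id}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={
            # PAT of the dedicated service account, not a personal user token
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Placeholder workspace host, token, and job id.
req = build_run_now_request("example.cloud.databricks.com", "dapi-REDACTED", 123)
print(req.full_url)
```

In practice you wouldn't hand-roll this: you'd store the PAT in an Airflow Databricks connection and let the Databricks provider's operators (e.g. `DatabricksRunNowOperator`) make this call for you. The point is that whatever token sits in that connection is the one that breaks if its owner is de-provisioned.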
Having a dedicated account helps with monitoring its usage and ensures CI/CD best practices are observed for code promotion. It also keeps a 'normal user' PAT out of the loop: if that user were to leave the company, they would be de-provisioned from the Databricks workspace, which invalidates their PAT and in turn breaks your Airflow integration.
Service accounts are a special type of non-human privileged account used to execute applications and run automated services, virtual machine instances, and other processes.