Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Configuring Airflow

Anonymous
Not applicable

Should we create a Databricks user for Airflow and generate a personal access token for it? We also have G Suite SSO enabled; does that mean I need to create a G Suite account for this user as well?

1 ACCEPTED SOLUTION

Accepted Solutions

User16783855117
Contributor II

I would recommend making the user that triggers the Databricks Jobs a dedicated user. This is what I would consider a 'service account', and a definition of that type of account is below.

Since you have SSO enabled, I would create this user in your IdP (G Suite) and propagate the newly created user into the Databricks workspace. Make sure this user has the appropriate job-creation permissions, then generate a personal access token (PAT) for the Airflow integration.
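To make the role of the PAT concrete, here is a minimal sketch of how the service account's token authenticates a call to the Databricks Jobs API `run-now` endpoint, which is the same mechanism Airflow's Databricks integration uses under the hood. The host, token, and job ID are placeholders, not real values.

```python
import json
import urllib.request

HOST = "https://example.cloud.databricks.com"  # placeholder workspace URL
TOKEN = "dapiXXXXXXXXXXXX"                     # placeholder PAT of the service account


def build_run_now_request(host: str, token: str, job_id: int) -> urllib.request.Request:
    """Build a Jobs API 2.1 run-now request authenticated with the PAT as a bearer token."""
    payload = json.dumps({"job_id": job_id}).encode("utf-8")
    return urllib.request.Request(
        url=f"{host}/api/2.1/jobs/run-now",
        data=payload,
        method="POST",
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )


req = build_run_now_request(HOST, TOKEN, job_id=123)
# urllib.request.urlopen(req) would actually trigger the job; omitted here.
```

Because the PAT belongs to the service account rather than a person, rotating it or auditing its usage never touches anyone's individual credentials.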

A dedicated account makes it easier to monitor usage and keeps CI/CD best practices for code promotion intact. It also keeps a normal user's PAT out of the loop: if that user were to leave the company, they would be de-provisioned from the Databricks workspace, which would invalidate the PAT and break your Airflow integration.

Service accounts are a special type of non-human privileged account used to execute applications and run automated services, virtual machine instances, and other processes.


