12-10-2024 11:26 AM
Hi,
I have Databricks on top of AWS, and a Databricks connection in Airflow (MWAA). I am able to connect and execute a Databricks job via Airflow using a personal access token. I believe the best practice is to connect using a service principal. I understand that I should use the client ID and the secret to connect, but I get a 401 error, which I believe is the result of incorrect OAuth M2M configuration.
Can someone shed some light on how it should be done?
Thanks.
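For context on where the 401 can come from: per the Databricks OAuth M2M docs, the SDKs ultimately exchange the client ID and secret for a token at the workspace's `/oidc/v1/token` endpoint using the client-credentials grant. Below is a rough sketch of what that request looks like (nothing is sent; the host, client ID, and secret are placeholders). A 401 at this step usually means the client ID/secret pair is wrong, or the service principal has no OAuth secret.

```python
# Sketch of the OAuth M2M (client-credentials) token request to a Databricks
# workspace. All credential values below are placeholders, not real ones.
import base64
import urllib.parse


def build_token_request(host: str, client_id: str, client_secret: str):
    """Build the URL, headers, and body of the token request (not sent here)."""
    url = f"{host}/oidc/v1/token"
    # Databricks expects HTTP Basic auth with client_id:client_secret.
    basic = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
    headers = {
        "Authorization": f"Basic {basic}",
        "Content-Type": "application/x-www-form-urlencoded",
    }
    body = urllib.parse.urlencode(
        {"grant_type": "client_credentials", "scope": "all-apis"}
    )
    return url, headers, body


url, headers, body = build_token_request(
    "https://dbc-a1b2345c-d6e7.cloud.databricks.com",  # placeholder workspace URL
    "my-client-id",      # placeholder: service principal application ID
    "my-client-secret",  # placeholder: service principal OAuth secret
)
print(url)
print(body)
```

If this exchange works with your credentials (e.g. via curl), the problem is in the Airflow connection settings rather than the service principal itself.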
12-10-2024 11:43 AM - edited 12-10-2024 11:43 AM
Hi @T_I,
Have you followed this guide? https://docs.databricks.com/en/dev-tools/auth/oauth-m2m.html
You will have to use these two parameters from the service principal: the client ID and the client secret.
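To illustrate, here is one way the Airflow Databricks connection for a service principal can be defined, as a JSON-valued `AIRFLOW_CONN_*` environment variable. This is a sketch, not a verified recipe: the `service_principal_oauth` extra is what recent versions of `apache-airflow-providers-databricks` appear to use for OAuth M2M, so check the provider docs for your version; all credential values are placeholders.

```python
# Sketch: a Databricks connection for Airflow using a service principal.
# Field names assume a recent apache-airflow-providers-databricks; verify
# "service_principal_oauth" against the provider docs for your version.
import json

conn = {
    "conn_type": "databricks",
    "host": "https://dbc-a1b2345c-d6e7.cloud.databricks.com",  # workspace URL
    "login": "my-client-id",         # placeholder: service principal client ID
    "password": "my-client-secret",  # placeholder: service principal OAuth secret
    "extra": json.dumps({"service_principal_oauth": True}),
}

# The value you would put in e.g. AIRFLOW_CONN_DATABRICKS_DEFAULT; the same
# fields can be entered through the Airflow UI or `airflow connections add`.
env_value = json.dumps(conn)
print(env_value)
```

On MWAA, the same connection can be created through the Airflow UI or stored in AWS Secrets Manager if you use it as the secrets backend.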
12-11-2024 12:41 AM
I have followed this guide, but I do not understand how to implement step 4 (OAuth M2M authentication)...
If you could assist, it would be greatly appreciated.
12-11-2024 05:29 AM
Hi @T_I,
Instead of the PAT, you have to specify the settings below to be able to use the service principal.
For workspace-level operations, set the following environment variables:
DATABRICKS_HOST, set to the Databricks workspace URL, for example https://dbc-a1b2345c-d6e7.cloud.databricks.com.
DATABRICKS_CLIENT_ID, set to the service principal's client (application) ID.
DATABRICKS_CLIENT_SECRET, set to the service principal's OAuth secret.
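As a quick sanity check, something like the small helper below can confirm all three variables are actually set before a tool tries to authenticate (an empty or missing one is a common cause of a 401). The variable names come from the Databricks OAuth M2M docs; the helper itself and the sample values are illustrative.

```python
# Sketch: verify the three environment variables Databricks tools look for
# when doing OAuth M2M authentication, failing early with a clear message.
import os

REQUIRED = ("DATABRICKS_HOST", "DATABRICKS_CLIENT_ID", "DATABRICKS_CLIENT_SECRET")


def load_databricks_env(environ=os.environ) -> dict:
    """Return the required variables, or raise if any is missing/empty."""
    missing = [name for name in REQUIRED if not environ.get(name)]
    if missing:
        raise RuntimeError(f"Missing environment variables: {', '.join(missing)}")
    return {name: environ[name] for name in REQUIRED}


# Example with a stand-in mapping instead of real credentials:
cfg = load_databricks_env({
    "DATABRICKS_HOST": "https://dbc-a1b2345c-d6e7.cloud.databricks.com",
    "DATABRICKS_CLIENT_ID": "my-client-id",          # placeholder
    "DATABRICKS_CLIENT_SECRET": "my-client-secret",  # placeholder
})
print(cfg["DATABRICKS_HOST"])
```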
12-11-2024 05:36 AM
Where should I specify those settings?