Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Azure Databricks databricks-cli authentication with M2M using environment variables

Chris2794
New Contributor

Which environment variables do I have to set to use the databricks-cli with M2M OAuth and a Microsoft Entra ID managed service principal? I have already added the service principal to the workspace.

I found the following documentation, but I am still confused about why it does not work:

https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/azure-sp

I set the env vars DATABRICKS_HOST, ARM_TENANT_ID, ARM_CLIENT_ID and ARM_CLIENT_SECRET, but when I try to run a command, I still get the error:
Error: InvalidConfigurationError: You haven't configured the CLI yet! Please configure by entering `/home/airflow/.local/bin/databricks configure`

I want to use env vars instead of config files, since this runs as part of an Airflow CI/CD pipeline.
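
For reference, the variables are exported roughly like this before the CLI is called (the values are placeholders):

export DATABRICKS_HOST=<workspace-url>
export ARM_TENANT_ID=<entra-tenant-id>
export ARM_CLIENT_ID=<service-principal-application-id>
export ARM_CLIENT_SECRET=<service-principal-client-secret>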

1 REPLY

ashraf1395
Valued Contributor III

I would suggest creating a .databrickscfg profile; it is the best practice and the easiest option.

You will need to create it in your home (~) directory, e.g. vim ~/.databrickscfg.
Inside the file you can define multiple profiles like this (see https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/azure-sp):

# For account level access 
[<some-unique-configuration-profile-name>]
host                = <account-console-url>
account_id          = <account-id>
azure_tenant_id     = <azure-service-principal-tenant-id>
azure_client_id     = <azure-service-principal-application-id>
azure_client_secret = <azure-service-principal-client-secret>

# For workspace level access
[<some-unique-configuration-profile-name>]
host                = <workspace-url>
azure_tenant_id     = <azure-service-principal-tenant-id>
azure_client_id     = <azure-service-principal-application-id>
azure_client_secret = <azure-service-principal-client-secret>

After filling in all these values you can check whether your profiles are configured correctly with the command 'databricks auth profiles'; every profile in your .databrickscfg will be listed along with whether its configuration is valid.
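
For example (the profile name is just a placeholder):

# list every profile and whether its configuration is valid
databricks auth profiles

# call a simple API with one specific profile
databricks current-user me -p <some-unique-configuration-profile-name>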

Then you can run any Databricks CLI command with the global argument -p or --profile <profile-name>, for example:

databricks bundle deploy -p <profile-name>

 

By default the .databrickscfg file is expected in your home (~) directory, but you can also place it in another directory and point the CLI to it with an environment variable. More info on it here: https://docs.databricks.com/aws/en/dev-tools/auth/unified-auth#config-profiles
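
For example, if the file lives somewhere else in your Airflow image, you can point the CLI at it with DATABRICKS_CONFIG_FILE (the path below is just an example):

export DATABRICKS_CONFIG_FILE=/opt/airflow/config/.databrickscfg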
