06-24-2021 08:45 AM
I want to use the Databricks CLI:
databricks clusters list
but this first requires an interactive configuration step:
databricks configure --token
Is there a way to use the Databricks CLI without manual intervention, so that it can run as part of a CI/CD pipeline?
Accepted Solutions
11-25-2021 10:38 AM
You can set two environment variables, DATABRICKS_HOST and DATABRICKS_TOKEN, and databricks-cli will use them; see the example of that in the DevOps pipeline.
The full list of environment variables is at the end of the Authentication section of the documentation.
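As a minimal sketch, a CI/CD step could export those two variables before calling the CLI; the workspace URL below and the `PIPELINE_SECRET_PAT` secret name are placeholders, assumed to be supplied by your pipeline's secret store:

```shell
# Sketch: configure databricks-cli non-interactively via environment variables.
# Both values are hypothetical placeholders; in a real pipeline the token
# would be injected from the CI system's secret store.
export DATABRICKS_HOST="https://adb-1234567890123456.7.azuredatabricks.net"
export DATABRICKS_TOKEN="$PIPELINE_SECRET_PAT"

# Any subsequent CLI call picks up the credentials from the environment
# (guarded here so the snippet is safe to run where the CLI is absent):
if command -v databricks >/dev/null 2>&1; then
    databricks clusters list
fi
```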
06-24-2021 10:15 AM
You can try configuring the credentials by editing the file ~/.databrickscfg .
The file content looks like this:
[DEFAULT]
host = [workspace url]
username = [email id]
password = [password]
[profile 1]
host = [workspace url]
token = [personal access token]
Once the file is in place, you do not need to run `databricks configure --token`; the Databricks CLI automatically picks up the credentials from the file.
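In a pipeline, the file itself can be generated in one step, for example with a heredoc; the workspace URL and token below are hypothetical placeholders:

```shell
# Sketch: create ~/.databrickscfg from a CI/CD step instead of running
# `databricks configure --token` interactively. Values are placeholders.
cat > ~/.databrickscfg <<'EOF'
[DEFAULT]
host = https://adb-1234567890123456.7.azuredatabricks.net
token = dapiXXXXXXXXXXXXXXXX
EOF
chmod 600 ~/.databrickscfg   # the token is a credential; restrict permissions
```

A named profile in the file can then be selected per command with `--profile`, e.g. `databricks clusters list --profile myprofile`.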
3 weeks ago
And which environment variables do I have to set to use the CLI with M2M OAuth using Microsoft Entra ID managed service principals?

