09-22-2024 07:27 PM
Hi all,
When I use Terraform in an Azure DevOps pipeline to create a Databricks storage credential, I get the following error. Has anybody encountered the same error before, or does anyone have an idea how to debug it?
Error: cannot create storage credential: failed during request visitor: default auth: azure-cli: cannot get account info: exit status 1.
My implementation:
(1) Below is my `main.tf`. It works fine locally with my own Azure account.
terraform {
  required_providers {
    azurerm = {
      source  = "hashicorp/azurerm"
      version = "4.2.0"
    }
    databricks = {
      source  = "databricks/databricks"
      version = "1.52.0"
    }
  }
}

provider "azurerm" {
  features {}
  subscription_id = var.AZURE_SUBSCRIPTION_ID
}

provider "databricks" {
  host                  = var.DATABRICKS_HOST
  retry_timeout_seconds = 600
}

data "azurerm_databricks_access_connector" "unity_catalog_access_connector" {
  name                = "unity-catalog-access-connector"
  resource_group_name = "rg-dbr-managed-${var.ENVIRONMENT}"
}

resource "databricks_storage_credential" "dbr_strg_cred" {
  name = "dbr_strg_cred_${var.ENVIRONMENT}"
  azure_managed_identity {
    access_connector_id = data.azurerm_databricks_access_connector.unity_catalog_access_connector.id
  }
}
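(The variable declarations are not shown above; a minimal sketch of what they could look like, assuming a conventional `variables.tf`, is below.)

variable "AZURE_SUBSCRIPTION_ID" {
  type = string
}

variable "DATABRICKS_HOST" {
  type = string
}

variable "ENVIRONMENT" {
  type = string
}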
(2) However, I get an error when I apply the same configuration in an Azure DevOps pipeline. Below is my `azure-pipelines.yaml`.
trigger: none

pool:
  vmImage: ubuntu-latest

variables:
  - group: vg-dbr-dev

stages:
  - stage: setup
    jobs:
      - job: setup
        displayName: "Set up Databricks workspace using Terraform"
        steps:
          - script: |
              echo "##vso[task.setvariable variable=TF_VAR_DATABRICKS_HOST]$DATABRICKS_HOST"
              echo "##vso[task.setvariable variable=TF_VAR_AZURE_SUBSCRIPTION_ID]$AZURE_SUBSCRIPTION_ID"
              echo "##vso[task.setvariable variable=TF_VAR_ENVIRONMENT]$ENV"
            displayName: 'Set up environment variables'

          - script: env | sort
            displayName: 'Environment / Context'

          - task: UsePythonVersion@0
            displayName: 'Use Python 3.12'
            inputs:
              versionSpec: 3.12

          - script: |
              python -m pip install wheel
            displayName: 'Install dependencies'

          - task: TerraformInstaller@1
            displayName: Install Terraform 1.9.2
            inputs:
              terraformVersion: 1.9.2

          - task: TerraformTaskV4@4
            displayName: Initialize Terraform
            inputs:
              provider: 'azurerm'
              command: 'init'
              backendServiceArm: '$(DEVOPS_SERVEICE_CONNECTION)'
              backendAzureRmResourceGroupName: '$(TERRAFORM_STATE_STORAGE_RESOURCE_GROUP_NAME)'
              backendAzureRmStorageAccountName: '$(TERRAFORM_STATE_STORAGE_ACCOUNT_NAME)'
              backendAzureRmContainerName: '$(TERRAFORM_STATE_STORAGE_CONTAINER_NAME)'
              backendAzureRmKey: 'state.tfstate'

          - task: TerraformTaskV4@4
            name: terraformPlan
            displayName: Create Terraform Plan
            inputs:
              provider: 'azurerm'
              command: 'plan'
              commandOptions: '-out main.tfplan'
              environmentServiceNameAzureRM: '$(DEVOPS_SERVEICE_CONNECTION)'

          # Only runs if the 'terraformPlan' task has detected changes in the state.
          - task: TerraformTaskV4@4
            displayName: Apply Terraform Plan
            condition: eq(variables['terraformPlan.changesPresent'], 'true')
            inputs:
              provider: 'azurerm'
              command: 'apply'
              commandOptions: 'main.tfplan'
              environmentServiceNameAzureRM: '$(DEVOPS_SERVEICE_CONNECTION)'
(3) In the Azure DevOps pipeline, I use a DevOps Service Connection, which refers to a Microsoft Entra ID app (a service principal). I have added this service principal to the Databricks account and the Databricks workspace, and granted it the account admin and workspace admin permissions.
However, the pipeline reports an error in the last step.
/opt/hostedtoolcache/terraform/1.9.2/x64/terraform apply -auto-approve main.tfplan
databricks_storage_credential.dbr_strg_cred: Creating...
╷
│ Error: cannot create storage credential: failed during request visitor: default auth: azure-cli: cannot get account info: exit status 1. Config: host=https://adb-123123123123123.1.azuredatabricks.net, azure_client_id=***, azure_tenant_id=d123123-1234-1234-1234-123123123. Env: DATABRICKS_HOST, ARM_CLIENT_ID, ARM_TENANT_ID
│
│   with databricks_storage_credential.dbr_strg_cred,
│   on main.tf line 33, in resource "databricks_storage_credential" "dbr_strg_cred":
│   33: resource "databricks_storage_credential" "dbr_strg_cred" {
│
╵
##[error]Error: The process '/opt/hostedtoolcache/terraform/1.9.2/x64/terraform' failed with exit code 1
Does anybody have any idea? Thank you.
09-23-2024 05:26 PM
Still struggling with this issue. Can anyone kindly help?
09-24-2024 01:41 PM
I found the reason. Because I did not configure the `auth_type` for the Terraform Databricks provider, it falls back to the default auth type `azure-cli`. However, my pipeline never logs in to the Azure CLI with `az login`, so the Databricks provider's authentication fails.
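For reference, if you want to keep using the Service Connection's service principal rather than a PAT, here is a minimal sketch of pinning the auth type explicitly. It assumes the pipeline exports ARM_CLIENT_ID, ARM_CLIENT_SECRET and ARM_TENANT_ID as environment variables for the Terraform tasks, which is not shown here:

provider "databricks" {
  host      = var.DATABRICKS_HOST
  auth_type = "azure-client-secret"
  # The provider reads azure_client_id / azure_client_secret / azure_tenant_id
  # from ARM_CLIENT_ID, ARM_CLIENT_SECRET and ARM_TENANT_ID when they are not
  # set in this block; those variables must be exposed to the Terraform tasks.
}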
3 weeks ago
How exactly do you need to configure `auth_type` in this case? I tried different options, but nothing seems to work. I would also like to use the Service Connection from an Azure DevOps pipeline to deploy Databricks via TerraformTaskV4@4.
3 weeks ago
I used the following configuration in `main.tf`.
provider "databricks" {
auth_type = "pat"
}
Then, in my Azure Pipeline, I configure the following:
......
- task: AzureCLI@2
  displayName: 'Get Databricks access token'
  inputs:
    azureSubscription: $(DEVOPS_SERVEICE_CONNECTION)
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: |
      echo "Getting access token..."
      DATABRICKS_TOKEN=$(az account get-access-token --resource 2ff814a6-3304-4ab8-85cb-cd0e6f879c1d --query "accessToken" -o tsv)
      echo "##vso[task.setvariable variable=DATABRICKS_TOKEN]$DATABRICKS_TOKEN"
......
- task: TerraformTaskV4@4
  name: terraformPlan
  displayName: Create Terraform Plan
  inputs:
    provider: 'azurerm'
    command: 'plan'
    commandOptions: '-out main.tfplan'
    environmentServiceNameAzureRM: '$(DEVOPS_SERVEICE_CONNECTION)'
......
I hope this is helpful.
3 weeks ago
Thank you for your reply! Where is $DATABRICKS_TOKEN then used in the terraform template?
3 weeks ago
echo "##vso[task.setvariable variable=DATABRICKS_TOKEN]$DATABRICKS_TOKEN"
This sets DATABRICKS_TOKEN as an environment variable for the subsequent pipeline tasks. As described in https://docs.databricks.com/aws/en/dev-tools/auth/pat and in the `databricks_token` page of the databricks/databricks provider in the Terraform Registry, Terraform picks up DATABRICKS_TOKEN from the environment and uses it for authentication because the provider is configured with
auth_type = "pat"