Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Cannot use Terraform to create Databricks Storage Credential

AlbertWang
Contributor III

Hi all,

When I use Terraform in an Azure DevOps pipeline to create a Databricks storage credential, I get the following error. Has anybody run into the same error before, or does anyone have an idea of how to debug it?

 

Error: cannot create storage credential: failed during request visitor: default auth: azure-cli: cannot get account info: exit status 1.

 

My implementation:

(1) Below is my `main.tf`. It works well locally with my own Azure account.

 

terraform {
  required_providers {
    azurerm = {
      source = "hashicorp/azurerm"
      version = "4.2.0"
    }
    databricks = {
      source = "databricks/databricks"
      version = "1.52.0"
    }
  }
}

provider "azurerm" {
    features {}
    subscription_id = "${var.AZURE_SUBSCRIPTION_ID}"
}

provider "databricks" {
  host  = var.DATABRICKS_HOST
  retry_timeout_seconds = 600
}

data "azurerm_databricks_access_connector" "unity_catalog_access_connector" {
  name                = "unity-catalog-access-connector"
  resource_group_name = "rg-dbr-managed-${var.ENVIRONMENT}"
}

resource "databricks_storage_credential" "dbr_strg_cred" {
  name = "dbr_strg_cred_${var.ENVIRONMENT}"
  azure_managed_identity {
    access_connector_id = data.azurerm_databricks_access_connector.unity_catalog_access_connector.id
  }
}
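
For completeness: the configuration above assumes that the three variables are declared in the module, for example in a `variables.tf` along these lines (a minimal sketch, since the actual declarations are not shown here; the names must match the `TF_VAR_*` environment variables exported in the pipeline below).

variable "DATABRICKS_HOST" {
  type        = string
  description = "Databricks workspace URL"
}

variable "AZURE_SUBSCRIPTION_ID" {
  type        = string
  description = "Azure subscription hosting the Databricks access connector"
}

variable "ENVIRONMENT" {
  type        = string
  description = "Environment suffix, e.g. dev or prod"
}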

 

(2) However, I get an error when I apply the Terraform configuration in an Azure DevOps pipeline. Below is my `azure-pipelines.yaml`.

 

trigger: none

pool:
  vmImage: ubuntu-latest

variables:
  - group: vg-dbr-dev

stages:
- stage: setup

  jobs:
  - job: setup
    displayName: "Set up Databricks workspace using Terraform"
    steps:

    - script: |
        echo "##vso[task.setvariable variable=TF_VAR_DATABRICKS_HOST]$DATABRICKS_HOST"
        echo "##vso[task.setvariable variable=TF_VAR_AZURE_SUBSCRIPTION_ID]$AZURE_SUBSCRIPTION_ID"
        echo "##vso[task.setvariable variable=TF_VAR_ENVIRONMENT]$ENV"
      displayName: 'Set up environment variables'

    - script: env | sort
      displayName: 'Environment / Context'

    - task: UsePythonVersion@0
      displayName: 'Use Python 3.12'
      inputs:
        versionSpec: 3.12

    - script: |
        python -m pip install wheel
      displayName: 'Install dependencies' 

    - task: TerraformInstaller@1
      displayName: Install Terraform 1.9.2
      inputs:
        terraformVersion: 1.9.2

    - task: TerraformTaskV4@4
      displayName: Initialize Terraform
      inputs:
        provider: 'azurerm'
        command: 'init'
        backendServiceArm: '$(DEVOPS_SERVEICE_CONNECTION)'
        backendAzureRmResourceGroupName: '$(TERRAFORM_STATE_STORAGE_RESOURCE_GROUP_NAME)'
        backendAzureRmStorageAccountName: '$(TERRAFORM_STATE_STORAGE_ACCOUNT_NAME)'
        backendAzureRmContainerName: '$(TERRAFORM_STATE_STORAGE_CONTAINER_NAME)'
        backendAzureRmKey: 'state.tfstate'

    - task: TerraformTaskV4@4
      name: terraformPlan
      displayName: Create Terraform Plan
      inputs:
        provider: 'azurerm'
        command: 'plan'
        commandOptions: '-out main.tfplan'
        environmentServiceNameAzureRM: '$(DEVOPS_SERVEICE_CONNECTION)'

    # Only runs if the 'terraformPlan' task has detected changes in the state.
    - task: TerraformTaskV4@4
      displayName: Apply Terraform Plan
      condition: eq(variables['terraformPlan.changesPresent'], 'true')
      inputs:
        provider: 'azurerm'
        command: 'apply'
        commandOptions: 'main.tfplan'
        environmentServiceNameAzureRM: '$(DEVOPS_SERVEICE_CONNECTION)'

 

(3) In the Azure DevOps pipeline, I use a DevOps service connection, which refers to a Microsoft Entra ID app (a service principal). I have added this service principal to the Databricks account and the Databricks workspace, and granted it both account admin and workspace admin permissions.

However, the pipeline reports an error in the last step.

 

/opt/hostedtoolcache/terraform/1.9.2/x64/terraform apply -auto-approve main.tfplan
databricks_storage_credential.dbr_strg_cred: Creating...
╷
│ Error: cannot create storage credential: failed during request visitor: default auth: azure-cli: cannot get account info: exit status 1. Config: host=https://adb-123123123123123.1.azuredatabricks.net, azure_client_id=***, azure_tenant_id=d123123-1234-1234-1234-123123123. Env: DATABRICKS_HOST, ARM_CLIENT_ID, ARM_TENANT_ID
│ 
│   with databricks_storage_credential.dbr_strg_cred,
│   on main.tf line 33, in resource "databricks_storage_credential" "dbr_strg_cred":
│   33: resource "databricks_storage_credential" "dbr_strg_cred" {
│ 
╵

##[error]Error: The process '/opt/hostedtoolcache/terraform/1.9.2/x64/terraform' failed with exit code 1

 

Does anybody have any idea? Thank you.

1 ACCEPTED SOLUTION


AlbertWang
Contributor III

I found the reason. Because I did not configure the `auth_type` for the Terraform Databricks provider, it falls back to the default auth type, `azure-cli`. However, my pipeline never logs in to the Azure CLI with `az login`, so the Databricks provider cannot authenticate.
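
One way to fix it, assuming the service principal's client ID, secret, and tenant ID are exposed to the Terraform tasks as the `ARM_CLIENT_ID`, `ARM_CLIENT_SECRET`, and `ARM_TENANT_ID` environment variables, is to pin the auth type in the provider block so it authenticates as the service principal instead of via the Azure CLI. A minimal sketch:

provider "databricks" {
  host                  = var.DATABRICKS_HOST
  retry_timeout_seconds = 600

  # Authenticate as the pipeline's service principal instead of falling back
  # to azure-cli. With auth_type pinned, the provider reads ARM_CLIENT_ID,
  # ARM_CLIENT_SECRET and ARM_TENANT_ID from the environment.
  auth_type = "azure-client-secret"
}

Alternatively, running `az login --service-principal` in a script step before the Terraform tasks keeps the default `azure-cli` auth working.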


2 REPLIES

AlbertWang
Contributor III

Still struggling with this issue. Can anyone kindly help?

AlbertWang
Contributor III

I found the reason. Because I did not configure the `auth_type` for the Terraform Databricks provider, it falls back to the default auth type, `azure-cli`. However, my pipeline never logs in to the Azure CLI with `az login`, so the Databricks provider cannot authenticate.
