Data Governance
Join discussions on data governance practices, compliance, and security within the Databricks Community. Exchange strategies and insights to ensure data integrity and regulatory compliance.

Issue with Creating External Location Using Service Principal in Terraform

jv_v
Contributor

I'm facing an issue while trying to create an external location in Databricks using Terraform and a service principal. The specific error message I'm encountering is:

Error (screenshot): jv_v_0-1719497498809.png

Here's some context:

  • Using Azure CLI (Az login): The creation of the external location works without any issues when I authenticate using Az login.
  • Using Service Principal: The error occurs when I switch to using a service principal for authentication in my Terraform code.

Here is a snippet of my Terraform provider configuration:

terraform {
  required_providers {
    azurerm = {
      source = "hashicorp/azurerm"
    }
    databricks = {
      source  = "databricks/databricks"
      version = "1.46.0"
    }
  }
}

provider "azurerm" {
  skip_provider_registration = true
  features {}
  subscription_id = var.subscription_id
  client_id       = var.client_id
  client_secret   = var.client_secret
  tenant_id       = var.tenant_id
}

// Provider for databricks account
provider "databricks" {
  alias         = "azure_account"
  host          = "https://accounts.azuredatabricks.net"
  account_id    = var.account_id
  client_id     = var.client_id
  client_secret = var.db_client_secret
}

// Provider for databricks workspace
provider "databricks" {
  alias         = "Workspace"
  host          = local.databricks_workspace_host
  client_id     = var.client_id
  client_secret = var.db_client_secret
}

resource "databricks_storage_credential" "external_mi" {
  provider = databricks.Workspace
  name     = var.storage_credential_name
  azure_managed_identity {
    access_connector_id = module.metastore_and_users.azurerm_databricks_access_connector_id
  }
  owner      = var.owner
  comment    = "Storage credential for all external locations"
  depends_on = [module.metastore_and_users.databricks_metastore_assignment]
}

output "storage_credential_result" {
  value = {
    storage1 = databricks_storage_credential.external_mi.name
    storage2 = databricks_storage_credential.external_mi.owner
    storage3 = databricks_storage_credential.external_mi.azure_managed_identity
  }
}

// Task011 Create external location to be used as root storage by dev catalog
resource "databricks_external_location" "dev_location" {
  provider = databricks.Workspace
  name     = var.external_location_name
  url = format("abfss://%s@%s.dfs.core.windows.net/",
    azurerm_storage_container.dev_catalog.name,
    module.metastore_and_users.azurerm_storage_account_unity_catalog.name)
  credential_name = databricks_storage_credential.external_mi.id
  owner           = var.owner
  comment         = "External location used by dev catalog as root storage"
}

Can anyone provide guidance on:

  1. The correct way to grant the CREATE EXTERNAL LOCATION permission to a service principal in Databricks?
  2. Any additional roles or permissions the service principal might need to successfully create an external location?
  3. Any potential misconfigurations in my Terraform setup that could be causing this issue?

Thanks in advance for your help!


jacovangelder
Honored Contributor

There is only one reason for this: the service principal you are using does not have the right grants at the metastore level.

Are you using the same service principal in your az login? If so, that would be very strange.

Have you created the metastore with Terraform as well, or manually?
If manually, you'll have to manually grant the service principal you're using for deploying resources the right privileges (grants).

If not manually, you can use databricks_grants to grant the CREATE_EXTERNAL_LOCATION privilege (and/or others) to the service principal.
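A minimal sketch of such a grant (the resource names, provider alias, and variable here are placeholders, not taken from your configuration):

```hcl
// Hypothetical example: grant metastore-level privileges to the
// deploying service principal, identified by its application (client) ID.
resource "databricks_grants" "metastore_grants" {
  provider  = databricks.azure_account
  metastore = databricks_metastore.this.id
  grant {
    principal  = var.client_id
    privileges = ["CREATE_EXTERNAL_LOCATION", "CREATE_STORAGE_CREDENTIAL"]
  }
}
```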

Yes, we are using the same service principal with az login.
Az login provider configuration:

// Provider for databricks account
provider "databricks" {
  alias      = "azure_account"
  host       = "https://accounts.azuredatabricks.net"
  account_id = var.account_id
  auth_type  = "azure-cli"
}

// Provider for databricks workspace
provider "databricks" {
  host = local.databricks_workspace_host
}

We are able to create the metastore with Terraform using the same service principal and the provider configuration below.

// Provider for databricks account
provider "databricks" {
  alias      = "azure_account"
  account_id = var.account_id
  client_id = var.client_id
  client_secret = var.db_client_secret
}

// Provider for databricks workspace
provider "databricks" {
  alias = "Workspace"
  host = local.databricks_workspace_host
  client_id = var.client_id
  client_secret = var.db_client_secret
}
 
However, we are facing the same issue with the Terraform block below.
resource "databricks_metastore_data_access" "first" {
  provider = databricks.azure_account
  metastore_id = databricks_metastore.this.id
  name         = "the-metastore-key"
  azure_managed_identity {
    access_connector_id = azurerm_databricks_access_connector.unity.id
  }
  is_default = true
  depends_on = [databricks_metastore_assignment.this]
}
Is there any link or dependency between databricks_metastore_data_access and databricks_external_location?
Thanks in advance for your help!
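For context, my understanding is that a databricks_metastore_data_access resource also registers a storage credential in the metastore, which an external location can then reference by name. A hypothetical minimal example (names are placeholders):

```hcl
// Hypothetical example; adjust names and URL to your environment.
resource "databricks_external_location" "example" {
  provider = databricks.Workspace
  name     = "example-location"
  url      = "abfss://container@account.dfs.core.windows.net/"
  // the data-access resource doubles as a storage credential,
  // referenced here by its name
  credential_name = databricks_metastore_data_access.first.name
  comment         = "Example referencing the metastore data-access credential"
}
```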

jacovangelder
Honored Contributor

After creating the databricks_metastore resource, can you run databricks_grants? Like this:
resource "databricks_grants" "foo" {
  depends_on = [databricks_metastore.foo]
  metastore  = databricks_metastore.foo.id
  grant {
    principal  = <your service principal>
    privileges = ["CREATE_EXTERNAL_LOCATION", <other privileges>]
  }
}
