Issue with Creating External Location Using Service Principal in Terraform
06-27-2024 07:24 AM
I'm facing an issue while trying to create an external location in Databricks using Terraform and a service principal. The specific error message I'm encountering is:
Error:
Here's some context:
- Using Azure CLI (Az login): The creation of the external location works without any issues when I authenticate using Az login.
- Using Service Principal: The error occurs when I switch to using a service principal for authentication in my Terraform code.
Here is a snippet of my Terraform provider configuration:
terraform {
  required_providers {
    azurerm = {
      source = "hashicorp/azurerm"
    }
    databricks = {
      source  = "databricks/databricks"
      version = "1.46.0"
    }
  }
}

provider "azurerm" {
  skip_provider_registration = true
  features {}
  subscription_id = var.subscription_id
  client_id       = var.client_id
  client_secret   = var.client_secret
  tenant_id       = var.tenant_id
}

// Provider for the Databricks account
provider "databricks" {
  alias         = "azure_account"
  host          = "https://accounts.azuredatabricks.net"
  account_id    = var.account_id
  client_id     = var.client_id
  client_secret = var.db_client_secret
}

// Provider for the Databricks workspace
provider "databricks" {
  alias         = "Workspace"
  host          = local.databricks_workspace_host
  client_id     = var.client_id
  client_secret = var.db_client_secret
}
resource "databricks_storage_credential" "external_mi" {
  provider = databricks.Workspace
  name     = var.storage_credential_name
  azure_managed_identity {
    access_connector_id = module.metastore_and_users.azurerm_databricks_access_connector_id
  }
  owner      = var.owner
  comment    = "Storage credential for all external locations"
  depends_on = [module.metastore_and_users.databricks_metastore_assignment]
}

output "storage_credential_result" {
  value = {
    storage1 = databricks_storage_credential.external_mi.name
    storage2 = databricks_storage_credential.external_mi.owner
    storage3 = databricks_storage_credential.external_mi.azure_managed_identity
  }
}

// Task011 Create external location to be used as root storage by dev catalog
resource "databricks_external_location" "dev_location" {
  provider = databricks.Workspace
  name     = var.external_location_name
  url = format("abfss://%s@%s.dfs.core.windows.net/",
    azurerm_storage_container.dev_catalog.name,
    module.metastore_and_users.azurerm_storage_account_unity_catalog.name)
  credential_name = databricks_storage_credential.external_mi.id
  owner           = var.owner
  comment         = "External location used by dev catalog as root storage"
}
Can anyone provide guidance on:
- The correct way to grant the CREATE EXTERNAL LOCATION permission to a service principal in Databricks?
- Any additional roles or permissions the service principal might need to successfully create an external location?
- Any potential misconfigurations in my Terraform setup that could be causing this issue?
Thanks in advance for your help!
06-27-2024 10:53 PM
There is only one reason for this: the service principal you are using does not have the right grants at the metastore level.
Are you using the same service principal with az login? If so, that is very strange.
Did you create the metastore with Terraform as well, or manually?
If manually, you'll have to grant the service principal you use for deploying resources the right privileges (grants) yourself.
If not, you can use databricks_grants to grant the CREATE_EXTERNAL_LOCATION privilege (and/or others) to the service principal.
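Even for a manually created metastore, the grant can still be managed from Terraform by referencing the metastore by its ID. A minimal sketch, assuming a hypothetical `var.metastore_id` holding the existing metastore's ID and `var.client_id` holding the service principal's application ID (neither appears in the original post):

```hcl
// Sketch: granting metastore-level privileges to a service principal on an
// existing, manually created metastore. var.metastore_id and var.client_id
// are assumed names, not values from the original configuration.
resource "databricks_grants" "metastore_grants" {
  provider  = databricks.Workspace
  metastore = var.metastore_id
  grant {
    principal  = var.client_id // the service principal's application (client) ID
    privileges = ["CREATE_EXTERNAL_LOCATION", "CREATE_STORAGE_CREDENTIAL"]
  }
}
```

Note that databricks_grants is authoritative for the object it manages: it replaces the full set of grants on the metastore, so every privilege each principal should keep must be listed.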
06-27-2024 11:29 PM
Yes, we are using the same service principal with az login.
az login provider configuration:
// Provider for the Databricks account
provider "databricks" {
  alias      = "azure_account"
  host       = "https://accounts.azuredatabricks.net"
  account_id = var.account_id
  auth_type  = "azure-cli"
}

// Provider for the Databricks workspace
provider "databricks" {
  host = local.databricks_workspace_host
}
We are able to create the metastore with Terraform using the same service principal and the provider configuration above.
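One thing worth double-checking: with `client_id`/`client_secret` the Databricks provider attempts Databricks OAuth (M2M) authentication, whereas the working run used `auth_type = "azure-cli"`. To authenticate explicitly as an Azure service principal instead, the provider's `azure_*` attributes can be used. A sketch, assuming `var.client_secret` is the Azure AD client secret (not a Databricks OAuth secret):

```hcl
// Sketch: workspace provider authenticating as an Azure service principal
// via the databricks provider's azure_* attributes. Variable names are
// assumed to match the original configuration.
provider "databricks" {
  alias               = "Workspace"
  host                = local.databricks_workspace_host
  azure_client_id     = var.client_id
  azure_client_secret = var.client_secret
  azure_tenant_id     = var.tenant_id
}
```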
Thanks in advance for your help!
06-28-2024 12:50 AM - edited 06-28-2024 12:51 AM
After creating the databricks_metastore resource, can you run databricks_grants? Like this:

resource "databricks_grants" "foo" {
  depends_on = [databricks_metastore.foo]
  metastore  = databricks_metastore.foo.id
  grant {
    principal  = <your service principal>
    privileges = ["CREATE_EXTERNAL_LOCATION", <other privileges>]
  }
}

