Data Governance
Join discussions on data governance practices, compliance, and security within the Databricks Community. Exchange strategies and insights to ensure data integrity and regulatory compliance.

unity catalog databricks_metastore terraform - cannot configure default credentials

JustLeo
New Contributor III

I'm creating a Unity Catalog metastore using Terraform on Azure. The first couple of applies worked just fine, with no errors at all, but today, when I incorporated another change to the code that was unrelated to Databricks, Terraform started giving me the error message below:

Error: cannot read metastore assignment: failed during request visitor: default auth: cannot configure default credentials, please check https://docs.databricks.com/en/dev-tools/auth.html#databricks-client-unified-authentication to configure credentials for your preferred authentication method. Config: azure_client_secret=***, azure_client_id=<MY_CLIENT_ID>, azure_tenant_id=<MY_TENANT_ID>. Env: ARM_CLIENT_SECRET, ARM_CLIENT_ID, ARM_TENANT_ID

│ with module.databricks.databricks_metastore_assignment.this,
│ on .terraform/modules/databricks/main.tf line 180, in resource "databricks_metastore_assignment" "this":
│ 180: resource "databricks_metastore_assignment" "this" {

 

In terms of providers, I only have the default for azurerm and one for databricks:

provider "azurerm" {
features {}
}

provider "databricks" {
host = module.databricks.databricks.workspace_url
}

What is strange about this error is that it doesn't fail during the first couple of applies; it's more like a random error. Any idea what could be the root cause? Any suggestions? Thanks in advance.

1 ACCEPTED SOLUTION


szymon_dybczak
Contributor III

Hi @JustLeo,

You haven't shown how you defined the resource, but maybe you're missing a depends_on clause?

According to the Terraform documentation:

In Terraform 0.13 and later, data resources have the same dependency resolution behavior as defined for managed resources. Most data resources make an API call to a workspace. If a workspace doesn't exist yet, default auth: cannot configure default credentials error is raised. To work around this issue and guarantee a proper lazy authentication with data resources, you should add depends_on = [azurerm_databricks_workspace.this] or depends_on = [databricks_mws_workspaces.this] to the body. This issue doesn't occur if a workspace is created in one module and resources within the workspace are created in another. We do not recommend using Terraform 0.12 and earlier if your usage involves data resources.
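For example, a minimal sketch of that workaround, assuming your metastore assignment lives in the same module as the workspace (resource names and arguments here are illustrative, not your actual code):

```hcl
# Hypothetical sketch: make the metastore assignment wait for the workspace,
# so the Databricks provider doesn't try to authenticate before it exists.
resource "azurerm_databricks_workspace" "this" {
  name                = "example-workspace"
  resource_group_name = azurerm_resource_group.this.name
  location            = azurerm_resource_group.this.location
  sku                 = "premium"
}

resource "databricks_metastore_assignment" "this" {
  metastore_id = databricks_metastore.this.id
  workspace_id = azurerm_databricks_workspace.this.workspace_id

  # Without this, Terraform may evaluate the provider's credentials before
  # the workspace exists, raising "cannot configure default credentials".
  depends_on = [azurerm_databricks_workspace.this]
}
```

Since your resource is defined inside a module (`module.databricks`), the depends_on would need to go on the resource inside that module, or on the module block itself.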

You can also compare your provider configuration to the one below, which comes from the official Terraform guide. To create a metastore and perform the metastore assignment to a workspace, they use a provider with the alias "accounts":

https://registry.terraform.io/providers/databricks/databricks/latest/docs/guides/unity-catalog-azure...

provider "databricks" {
  host = local.databricks_workspace_host
}

provider "databricks" {
  alias      = "accounts"
  host       = "https://accounts.azuredatabricks.net"
  account_id = var.databricks_account_id
}
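With that setup, account-level resources such as the metastore select the aliased provider explicitly, while workspace-level resources keep using the default one. A sketch (arguments are illustrative, adapted from the guide's pattern):

```hcl
# Sketch: account-level resources reference the "accounts" alias explicitly.
resource "databricks_metastore" "this" {
  provider      = databricks.accounts
  name          = "primary"
  region        = "westeurope" # illustrative region
  force_destroy = true
}

resource "databricks_metastore_assignment" "this" {
  provider     = databricks.accounts
  metastore_id = databricks_metastore.this.id
  workspace_id = azurerm_databricks_workspace.this.workspace_id
}
```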

 

You can also take a look at the thread below, which describes a similar problem:

https://community.databricks.com/t5/data-governance/issue-creating-metastore-using-terraform-with-se...


