
Error creating token when creating databricks_mws_workspaces resource on GCP

yurib
New Contributor III

 

resource "databricks_mws_workspaces" "this" {
  depends_on     = [databricks_mws_networks.this]
  provider       = databricks.account
  account_id     = var.databricks_account_id
  workspace_name = "${local.prefix}-dbx-ws"
  location       = var.google_region

  cloud_resource_container {
    gcp {
      project_id = var.google_project
    }
  }

  private_access_settings_id = var.databricks_pas_id
  network_id                 = databricks_mws_networks.this.network_id

  gke_config {
    connectivity_type = "PRIVATE_NODE_PUBLIC_MASTER"
    master_ip_range   = var.mws_workspace_gke_master_ip_range
  }

  token {}
  pricing_tier = "PREMIUM"
}
...
│ Error: cannot create mws workspaces: cannot create token: failed during request visitor: default auth: cannot configure default credentials, please check https://docs.databricks.com/en/dev-tools/auth.html#databricks-client-unified-authentication to configure credentials for your preferred authentication method. Config: host=https://xxxx.gcp.databricks.com, google_service_account=xxx

 

The workspace appears to be created - I can interact with it in the Databricks account console and with the respective google resources via google cloud console, but the `terraform apply` command fails with the above error.

`terraform destroy` fails with a similar error about reading the token.

The errors go away if I authenticate against the newly created workspace (`databricks auth login --host https://xxx.gcp.databricks.com`), but at that point Terraform marks the workspace resource as tainted, forcing re-creation and producing the same error for the next workspace.
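One way to rule out ambiguity between competing credential sources while debugging is to pin the auth method explicitly. `DATABRICKS_AUTH_TYPE` is part of Databricks unified client authentication; the exact value below ("google-credentials") is an assumption to verify against the provider's auth docs for your version — a sketch, not a confirmed fix:

```shell
# Sketch: pin Databricks unified auth to Google credentials so a stray
# ~/.databrickscfg profile cannot be picked up instead.
# "google-credentials" is the assumed auth-type name; check the provider's
# authentication documentation for your version.
export DATABRICKS_AUTH_TYPE=google-credentials

# then re-run:
#   terraform apply
```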

versions / provider definition:

 

terraform {
  required_providers {
    databricks = {
      source  = "databricks/databricks"
      version = "1.48.3"
      configuration_aliases = [ databricks.account, databricks.workspace ]
    }
  }
}

provider "databricks" {
  alias                  = "account"
  host                   = var.account_console_url
  account_id             = var.databricks_account_id
  google_service_account = var.databricks_google_service_account
}

provider "databricks" {
  alias                  = "workspace"
  host                   = databricks_mws_workspaces.this.workspace_url
  token                  = databricks_mws_workspaces.this.token[0].token_value
}

 

 

1 ACCEPTED SOLUTION


yurib
New Contributor III

My issue was caused by credentials in `~/.databrickscfg` (generated by the Databricks CLI) taking precedence over the credentials set by `gcloud auth application-default login`. Google's Application Default Credentials should be used when deploying resources with the Databricks Google service account. Deleting or renaming `~/.databrickscfg` resolved my problem.
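The fix can be sketched as a couple of shell steps (the path is the CLI's default config location; adjust if yours differs):

```shell
# Move the CLI-generated profile out of the way so it no longer takes
# precedence over Google Application Default Credentials.
if [ -f ~/.databrickscfg ]; then
  mv ~/.databrickscfg ~/.databrickscfg.bak
fi

# Re-establish ADC for the deployment, then retry:
#   gcloud auth application-default login
#   terraform apply
```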


2 REPLIES

Kaniz_Fatma
Community Manager

Hi @yurib

  • Ensure that you're using the correct authentication method. For Databricks on Google Cloud, Google service-account authentication is recommended; you can create a separate provider instance in Terraform that authenticates with those credentials.
  • Make sure you've set up the necessary environment variables (e.g., DATABRICKS_HOST and DATABRICKS_TOKEN) correctly.
  • If you're using a personal access token, ensure that it's correctly propagated within your pipeline.
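For the environment-variable route, a minimal sketch (both values are placeholders, not real credentials — substitute your workspace URL and a valid personal access token):

```shell
# Placeholder values; substitute your workspace URL and a valid PAT.
export DATABRICKS_HOST="https://xxxx.gcp.databricks.com"
export DATABRICKS_TOKEN="dapiXXXXXXXXXXXX"  # hypothetical token
```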

