Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Error: cannot create mws credentials: invalid Databricks Account configuration

MYB24
New Contributor III

Good evening,

I am configuring databricks_mws_credentials through Terraform on AWS, and I am getting the following error:

Error: cannot create mws credentials: invalid Databricks Account configuration

│ with module.databricks.databricks_mws_credentials.this,
│ on modules/aws-databricks-tf/main.tf line 128, in resource "databricks_mws_credentials" "this":
│ 128: resource "databricks_mws_credentials" "this" {


I have checked my account username, password, and account ID, which are all correct, but I still get the same error. Is there anything I am missing?

Below is my code:

resource "databricks_mws_credentials" "this" {
  # provider         = databricks.mws
  account_id       = var.databricks_account_id
  # role_arn         = aws_iam_role.cross_account_role.arn
  role_arn         = var.databricks_role_arn
  credentials_name = var.databricks_credentials
  # depends_on       = [aws_iam_role_policy.this]
}
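For reference, the commented-out `provider = databricks.mws` line means this resource is not bound to an account-level provider, which is a common cause of "invalid Databricks Account configuration". A minimal sketch of the aliased account-level provider such a resource would reference (variable names are illustrative, and this uses the username/password auth from the original post, not necessarily what your setup requires):

```hcl
# Account-level provider, aliased so account-scoped resources can select it.
# The host is the Databricks accounts console, not a workspace URL.
provider "databricks" {
  alias      = "mws"
  host       = "https://accounts.cloud.databricks.com"
  account_id = var.databricks_account_id
  username   = var.databricks_account_username
  password   = var.databricks_account_password
}

resource "databricks_mws_credentials" "this" {
  provider         = databricks.mws # bind to the account-level provider
  account_id       = var.databricks_account_id
  role_arn         = var.databricks_role_arn
  credentials_name = var.databricks_credentials
}
```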

 

 

module "databricks" {
  source = "./modules/aws-databricks-tf"

  region                         = var.region
  databricks_vpc_id              = element(data.aws_vpcs.vpc_list.ids, 0)
  databricks_subnet_ids          = data.aws_subnets.subnet_ids_private.ids # Private subnets 1/2
  databricks_security_group_ids  = [aws_security_group.databricksNode-sg.id]
  env                            = var.env
  databricks_account_username    = "my-username"
  databricks_account_password    = "mypassword"
  databricks_account_id          = "account-id"
  databricks_root_storage_bucket = "root-bucket"

  # cross_account_iam_role = "DatabricksRole"
  # Instead of creating the Databricks role via Terraform, I have created it
  # manually in the UI with the relevant policies and trust relationship.
  databricks_role_arn    = "my-databricks-role"
  databricks_credentials = "DatabricksCredentials"
}

#aws #terraform #Credentials #error

 

1 ACCEPTED SOLUTION


MYB24
New Contributor III

Managed to fix the issue by updating the provider.tf file. I had to create a service principal token and add it to my provider.tf file.

provider "databricks" {
  alias         = "accounts"
  client_id     = "service-principal-id"
  client_secret = "service-principal-secret"
  account_id    = "databricks-account-id"
}
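For completeness, an account-level provider block of this shape typically also sets the accounts-console host; a sketch with placeholder values (the host URL is the standard AWS accounts endpoint, the rest are stand-ins for your own service principal):

```hcl
provider "databricks" {
  alias         = "accounts"
  host          = "https://accounts.cloud.databricks.com"
  account_id    = "databricks-account-id"
  client_id     = "service-principal-client-id"
  client_secret = "service-principal-client-secret"
}
```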


6 REPLIES

Debayan
Databricks Employee

Hi, this looks like a formatting error. Could you please follow the example code here and let us know if it helps: https://registry.terraform.io/providers/databricks/databricks/latest/docs/resources/mws_credentials

MYB24
New Contributor III

Hi @Debayan, this code was working before. I had deleted the mws credentials file from my CloudFormation stack and wanted to create a new one. When I tried to create a new mws credential and link it to a new workspace, it gave me the above error. I have provided the whole file below, which creates the mws credential and workspace:


resource "aws_kms_key" "this" {
  enable_key_rotation = true
}

resource "aws_s3_bucket" "root_storage_bucket" {
  bucket        = var.databricks_root_storage_bucket
  force_destroy = true
}

resource "aws_s3_bucket_server_side_encryption_configuration" "s3_bucket_encryption" {
  bucket = var.databricks_root_storage_bucket

  rule {
    apply_server_side_encryption_by_default {
      kms_master_key_id = aws_kms_key.this.arn
      sse_algorithm     = "aws:kms"
    }
  }
}

resource "aws_s3_bucket_versioning" "root_bucket" {
  bucket = var.databricks_root_storage_bucket

  versioning_configuration {
    status = "Enabled"
  }
}

resource "aws_s3_bucket_ownership_controls" "root_boot_ownership" {
  bucket = var.databricks_root_storage_bucket
  rule {
    object_ownership = "BucketOwnerPreferred"
  }
}

resource "aws_s3_bucket_acl" "root_bucket_acl" {
  depends_on = [aws_s3_bucket_ownership_controls.root_boot_ownership]

  bucket = var.databricks_root_storage_bucket
  acl    = "private"
}

resource "aws_s3_bucket_logging" "root_bucket_logging" {
  bucket = var.databricks_root_storage_bucket

  target_bucket = var.logging_target_bucket
  target_prefix = var.logging_target_prefix
}

resource "aws_s3_bucket_public_access_block" "root_storage_bucket" {
  bucket                  = aws_s3_bucket.root_storage_bucket.id
  block_public_acls       = true
  block_public_policy     = true
  ignore_public_acls      = true
  restrict_public_buckets = true
  depends_on              = [aws_s3_bucket.root_storage_bucket]
}

data "databricks_aws_bucket_policy" "this" {
  bucket = aws_s3_bucket.root_storage_bucket.bucket
}

resource "aws_s3_bucket_policy" "root" {
  bucket = aws_s3_bucket.root_storage_bucket.id
  policy = <<POLICY
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::account-id:root"
      },
      "Action": [
        "s3:GetObject",
        "s3:GetObjectVersion",
        "s3:PutObject",
        "s3:DeleteObject",
        "s3:ListBucket",
        "s3:GetBucketLocation"
      ],
      "Resource": [
        "arn:aws:s3:::${var.env}databricks/*",
        "arn:aws:s3:::${var.env}databricks"
      ]
    }
  ]
}
POLICY
}


resource "databricks_mws_networks" "this" {
  provider           = databricks.mws
  account_id         = var.databricks_account_id
  network_name       = var.databricks_network_name
  vpc_id             = var.databricks_vpc_id
  subnet_ids         = flatten(var.databricks_subnet_ids)
  security_group_ids = var.databricks_security_group_ids
}

resource "databricks_mws_storage_configurations" "this" {
  provider                   = databricks.mws
  account_id                 = var.databricks_account_id
  bucket_name                = aws_s3_bucket.root_storage_bucket.bucket
  storage_configuration_name = var.databricks_root_storage_bucket
}

resource "databricks_mws_credentials" "this" {
  # provider         = databricks.mws
  account_id       = var.databricks_account_id
  # role_arn         = aws_iam_role.cross_account_role.arn
  role_arn         = var.databricks_role_arn
  credentials_name = var.databricks_credentials
  # depends_on       = [aws_iam_role_policy.this]
}

# resource "databricks_mws_workspaces" "this" {
#   provider       = databricks.mws
#   account_id     = var.databricks_account_id
#   aws_region     = var.region
#   workspace_name = var.workspace_name
#   # deployment_name = var.workspace_name
#
#   credentials_id           = databricks_mws_credentials.this.credentials_id
#   storage_configuration_id = databricks_mws_storage_configurations.this.storage_configuration_id
#   network_id               = databricks_mws_networks.this.network_id
# }

 


TMD
New Contributor III

Just to add some context on what is probably the underlying issue: an account-level service principal (with OAuth) is now required.

I experienced the same issue while using a username and password, which is how the TF provider was configured for existing workspaces created prior to 11/2023. It looks like Databricks is expecting/enforcing an account-level service principal in the TF provider for new workspaces created after 11/2023.
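If you want to manage that service principal with Terraform as well, a sketch along these lines should work against an already-authenticated account-level provider (the display name, aliases, and outputs are illustrative, and the secret resource requires a reasonably recent Databricks provider version):

```hcl
# Create an account-level service principal and an OAuth secret for it.
# Assumes an account-level provider aliased "mws" that is already
# authenticated, e.g. as an account admin.
resource "databricks_service_principal" "terraform" {
  provider     = databricks.mws
  display_name = "terraform-automation"
}

resource "databricks_service_principal_secret" "terraform" {
  provider             = databricks.mws
  service_principal_id = databricks_service_principal.terraform.id
}

# The application_id / secret pair is what goes into the provider's
# client_id / client_secret arguments.
output "client_id" {
  value = databricks_service_principal.terraform.application_id
}

output "client_secret" {
  value     = databricks_service_principal_secret.terraform.secret
  sensitive = true
}
```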

 

Alexandre467
New Contributor II

Hello, I'm facing a similar issue. I tried to update my TF with the proper authentication, and I get this error:

╷
│ Error: cannot create mws credentials: failed visitor: context canceled
│ 
│   with databricks_mws_credentials.this,
│   on main.tf line 8, in resource "databricks_mws_credentials" "this":

How do you add context, and what is it?

TMD
New Contributor III

Hello  Alexandre467,

Not sure what your issue is. The reference to "context" in my earlier reply was referring to the situation described in its second paragraph.

If you add a bit of detail about what you have configured and what the issue is, I can comment on it.

FYI, here are example account-level and workspace-level provider configs. Please pay attention to the Databricks Terraform provider documentation as to which one you need for the resource you are dealing with:

 

 

provider "databricks" {
  # Account-level resources, such as workspaces
  alias         = "mws"
  host          = var.databricks_host // https://accounts.cloud.databricks.com
  account_id    = var.databricks_account_id
  client_id     = var.databricks_tfe_client_id
  client_secret = var.databricks_tfe_client_secret
}

provider "databricks" {
  # Workspace-level, but using the account-level credential
  alias         = "databricks-ws"
  host          = var.databricks_domain_for_workspace // https://xyz-ws_name.cloud.databricks.com
  client_id     = var.databricks_tfe_client_id
  client_secret = var.databricks_tfe_client_secret
}
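A resource then selects one of these aliased providers through its `provider` meta-argument. A short sketch (the resource arguments are illustrative placeholders, not from this thread):

```hcl
# Account-scoped resource: bind to the account-level ("mws") provider.
resource "databricks_mws_credentials" "this" {
  provider         = databricks.mws
  account_id       = var.databricks_account_id
  role_arn         = var.databricks_role_arn
  credentials_name = "my-credentials"
}

# Workspace-scoped resource: bind to the workspace-level provider.
resource "databricks_cluster" "shared" {
  provider      = databricks.databricks-ws
  cluster_name  = "shared-autoscaling"
  spark_version = "13.3.x-scala2.12"
  node_type_id  = "m5.large"
  autoscale {
    min_workers = 1
    max_workers = 4
  }
}
```

Using the wrong provider scope for a resource (e.g. a workspace-level provider for an account-level `databricks_mws_*` resource) is a frequent source of the "invalid Databricks Account configuration" class of errors.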

 

 

 
