
Error: cannot create mws credentials: invalid Databricks Account configuration

MYB24
New Contributor III

Good Evening, 

I am configuring databricks_mws_credentials through Terraform on AWS.  I am getting the following error:

Error: cannot create mws credentials: invalid Databricks Account configuration

│ with module.databricks.databricks_mws_credentials.this,
│ on modules/aws-databricks-tf/main.tf line 128, in resource "databricks_mws_credentials" "this":
│ 128: resource "databricks_mws_credentials" "this" {


I have checked my account username, password, and account ID, which are all correct, but I am still getting the same error. Is there anything I am missing?

Below is my code: 

resource "databricks_mws_credentials" "this" {

#provider = databricks.mws

account_id = var.databricks_account_id

# role_arn = aws_iam_role.cross_account_role.arn

role_arn = var.databricks_role_arn

credentials_name = var.databricks_credentials

# depends_on = [aws_iam_role_policy.this]

}

 

 

module "databricks" {

source = "./modules/aws-databricks-tf"

 

region = var.region

databricks_vpc_id = element(data.aws_vpcs.vpc_list.ids, 0)

databricks_subnet_ids = data.aws_subnets.subnet_ids_private.ids # Private Subnet 1/2

databricks_security_group_ids = [aws_security_group.databricksNode-sg.id]

env = var.env

databricks_account_username = "my-username"

databricks_account_password = "mypassword"

databricks_account_id = "account-id"

databricks_root_storage_bucket = "root-bucket"

 

# cross_account_iam_role = "DatabricksRole"

databricks_role_arn = my-databricks-role # Instead of creating the Databricks Role via terraform, I have created it manually in the UI with relevant policies and trust relationship

databricks_credentials = "DatabricksCredentials"

 

#aws #terraform #Credentials #error

 


5 REPLIES

Debayan
Esteemed Contributor III

Hi, this looks like a formatting error. Could you please follow the example code here and let us know if it helps: https://registry.terraform.io/providers/databricks/databricks/latest/docs/resources/mws_credentials
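
For reference, a minimal sketch along the lines of that documentation page (the provider alias, the prefix variable, and the role/policy names below are illustrative placeholders, not values taken from this thread):

# Account-level provider used by all databricks_mws_* resources;
# authentication (for example a service principal) is supplied separately.
provider "databricks" {
  alias      = "mws"
  host       = "https://accounts.cloud.databricks.com"
  account_id = var.databricks_account_id
}

# Cross-account IAM role that Databricks assumes in your AWS account
data "databricks_aws_assume_role_policy" "this" {
  external_id = var.databricks_account_id
}

resource "aws_iam_role" "cross_account_role" {
  name               = "${var.prefix}-crossaccount"
  assume_role_policy = data.databricks_aws_assume_role_policy.this.json
}

data "databricks_aws_crossaccount_policy" "this" {}

resource "aws_iam_role_policy" "this" {
  name   = "${var.prefix}-policy"
  role   = aws_iam_role.cross_account_role.id
  policy = data.databricks_aws_crossaccount_policy.this.json
}

resource "databricks_mws_credentials" "this" {
  provider         = databricks.mws
  account_id       = var.databricks_account_id
  credentials_name = "${var.prefix}-creds"
  role_arn         = aws_iam_role.cross_account_role.arn
  depends_on       = [aws_iam_role_policy.this]
}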

MYB24
New Contributor III

Hi @Debayan, this code was working before. I had deleted the mws credential file from my CloudFormation stack and wanted to create a new one. When I tried to create a new mws credential and link it to a new workspace, it gave me the above error. I have provided the whole file below, which creates the mws credentials and workspace.


resource "aws_kms_key" "this" {
enable_key_rotation = true
}


resource "aws_s3_bucket" "root_storage_bucket" {
bucket = var.databricks_root_storage_bucket
force_destroy = true
}

resource "aws_s3_bucket_server_side_encryption_configuration" "s3_bucket_encryption" {
bucket = var.databricks_root_storage_bucket

rule {
apply_server_side_encryption_by_default {
kms_master_key_id = aws_kms_key.this.arn
sse_algorithm = "aws:kms"
}
}
}


resource "aws_s3_bucket_versioning" "root_bucket" {
bucket = var.databricks_root_storage_bucket

versioning_configuration {
status = "Enabled"
}
}

resource "aws_s3_bucket_ownership_controls" "root_boot_ownership" {
bucket = var.databricks_root_storage_bucket
rule {
object_ownership = "BucketOwnerPreferred"
}
}

resource "aws_s3_bucket_acl" "root_bucket_acl" {
depends_on = [aws_s3_bucket_ownership_controls.root_boot_ownership]

bucket = var.databricks_root_storage_bucket
acl = "private"
}

resource "aws_s3_bucket_logging" "root_bucket_logging" {
bucket = var.databricks_root_storage_bucket

target_bucket = var.logging_target_bucket
target_prefix = var.logging_target_prefix
}

resource "aws_s3_bucket_public_access_block" "root_storage_bucket" {
bucket = aws_s3_bucket.root_storage_bucket.id
block_public_acls = true
block_public_policy = true
ignore_public_acls = true
restrict_public_buckets = true
depends_on = [aws_s3_bucket.root_storage_bucket]
}

data "databricks_aws_bucket_policy" "this" {
bucket = aws_s3_bucket.root_storage_bucket.bucket
}

resource "aws_s3_bucket_policy" "root" {
bucket = aws_s3_bucket.root_storage_bucket.id
policy = <<POLICY
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Principal": {
"AWS": "arn:aws:iam::account-id:root"
},
"Action": [
"s3:GetObject",
"s3:GetObjectVersion",
"s3:PutObject",
"s3:DeleteObject",
"s3:ListBucket",
"s3:GetBucketLocation"
],
"Resource": [
"arn:aws:s3:::${var.env}databricks/*",
"arn:aws:s3:::${var.env}atabricks"
]
}
]
}
POLICY
}


resource "databricks_mws_networks" "this" {
provider = databricks.mws
account_id = var.databricks_account_id
network_name = var.databricks_network_name
vpc_id = var.databricks_vpc_id
subnet_ids = flatten(var.databricks_subnet_ids)
security_group_ids = var.databricks_security_group_ids
}

resource "databricks_mws_storage_configurations" "this" {
provider = databricks.mws
account_id = var.databricks_account_id
bucket_name = aws_s3_bucket.root_storage_bucket.bucket
storage_configuration_name = var.databricks_root_storage_bucket
}

 

resource "databricks_mws_credentials" "this" {
#provider = databricks.mws
account_id = var.databricks_account_id
# role_arn = aws_iam_role.cross_account_role.arn
role_arn = var.databricks_role_arn
credentials_name = var.databricks_credentials
# depends_on = [aws_iam_role_policy.this]
}

# resource "databricks_mws_workspaces" "this" {
# provider = databricks.mws
# account_id = var.databricks_account_id
# aws_region = var.region
# workspace_name = var.workspace_name
# # deployment_name = var.workspace_name

# credentials_id = databricks_mws_credentials.this.credentials_id
# storage_configuration_id = databricks_mws_storage_configurations.this.storage_configuration_id
# network_id = databricks_mws_networks.this.network_id
# }
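
A side note on the provider wiring: this file references an aliased provider, databricks.mws, which is not declared in the snippet. A minimal sketch of how that alias could be passed from the root module into modules/aws-databricks-tf, assuming the root module defines an account-level provider named databricks.accounts (the names are illustrative):

# Inside modules/aws-databricks-tf: declare the alias the module expects
terraform {
  required_providers {
    databricks = {
      source                = "databricks/databricks"
      configuration_aliases = [databricks.mws]
    }
  }
}

# In the root module: map the account-level provider onto that alias
module "databricks" {
  source = "./modules/aws-databricks-tf"

  providers = {
    databricks.mws = databricks.accounts
  }

  # ... remaining module arguments ...
}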

 

Kaniz
Community Manager

Thank you for posting your question in our community! We are happy to assist you.

To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question?

This will also help other community members who may have similar questions in the future. Thank you for your participation and let us know if you need any further assistance! 
 

MYB24
New Contributor III

Managed to fix the issue by updating the provider.tf file. I had to create a service principal token and add it to my provider.tf file.

provider "databricks" {
alias = "accounts"
client_id = "service-principle-id"
client_secret = "service-principle-secret"
account_id = "databricks-account-id"

TMD
New Contributor III

Just to add some context on what is probably the underlying issue: the requirement for an account-level service principal (with OAuth).

I experienced the same issue while using a username and password, which is how the TF provider was configured for existing workspaces created prior to 11/2023. It looks like Databricks is expecting/enforcing an account-level service principal in the TF provider for new workspaces created after 11/2023.
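
For anyone hitting the same error, a minimal sketch of an account-level provider block set up for OAuth, assuming the service principal's ID and secret are supplied through the provider's standard environment variables rather than hardcoded (all values shown are placeholders):

# provider.tf - account-level provider authenticating as a service principal
provider "databricks" {
  alias      = "accounts"
  host       = "https://accounts.cloud.databricks.com" # AWS account console endpoint
  account_id = var.databricks_account_id
  # client_id and client_secret can be omitted here and picked up from the
  # DATABRICKS_CLIENT_ID and DATABRICKS_CLIENT_SECRET environment variables,
  # which keeps the OAuth secret out of version control.
}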

 
