01-10-2024 08:24 AM
Good Evening,
I am configuring databricks_mws_credentials through Terraform on AWS. I am getting the following error:
Error: cannot create mws credentials: invalid Databricks Account configuration
│
│ with module.databricks.databricks_mws_credentials.this,
│ on modules/aws-databricks-tf/main.tf line 128, in resource "databricks_mws_credentials" "this":
│ 128: resource "databricks_mws_credentials" "this" {
│
I have checked my account username, password, and account ID, which are all correct, but I am still getting the same error. Is there anything I am missing?
Below is my code:
resource "databricks_mws_credentials" "this" {
#provider = databricks.mws
account_id = var.databricks_account_id
# role_arn = aws_iam_role.cross_account_role.arn
role_arn = var.databricks_role_arn
credentials_name = var.databricks_credentials
# depends_on = [aws_iam_role_policy.this]
}
module "databricks" {
source = "./modules/aws-databricks-tf"
region = var.region
databricks_vpc_id = element(data.aws_vpcs.vpc_list.ids, 0)
databricks_subnet_ids = data.aws_subnets.subnet_ids_private.ids # Private Subnet 1/2
databricks_security_group_ids = [aws_security_group.databricksNode-sg.id]
env = var.env
databricks_account_username = "my-username"
databricks_account_password = "mypassword"
databricks_account_id = "account-id"
databricks_root_storage_bucket = "root-bucket"
# cross_account_iam_role = "DatabricksRole"
databricks_role_arn = "my-databricks-role" # Instead of creating the Databricks role via Terraform, I have created it manually in the UI with the relevant policies and trust relationship
databricks_credentials = "DatabricksCredentials"
}
#aws #terraform #Credentials #error
01-18-2024 08:49 AM
Managed to fix the issue by updating the provider.tf file. I had to create a service principal token and add it to my provider.tf file.
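For anyone hitting the same error, here is a minimal provider.tf sketch of what that fix looks like with an account-level service principal. The variable names are placeholders, not from the original post:

```hcl
# Account-level provider for MWS resources; authenticates with a
# service principal's OAuth client ID/secret instead of username/password.
provider "databricks" {
  alias         = "mws"
  host          = "https://accounts.cloud.databricks.com"
  account_id    = var.databricks_account_id
  client_id     = var.databricks_sp_client_id     # service principal application ID (placeholder)
  client_secret = var.databricks_sp_client_secret # OAuth secret generated for it (placeholder)
}
```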
01-17-2024 08:25 PM
Hi, this looks like a formatting error. Could you please follow the example code here and let us know if it helps: https://registry.terraform.io/providers/databricks/databricks/latest/docs/resources/mws_credentials
01-18-2024 04:00 AM
Hi @Debayan, this code was working before. I had deleted the MWS credential from my stack and wanted to create a new one. When I tried to create a new MWS credential and link it to a new workspace, it gave me the above error. I have provided the whole file below, which creates the MWS credential and workspace:
resource "aws_kms_key" "this" {
enable_key_rotation = true
}
resource "aws_s3_bucket" "root_storage_bucket" {
bucket = var.databricks_root_storage_bucket
force_destroy = true
}
resource "aws_s3_bucket_server_side_encryption_configuration" "s3_bucket_encryption" {
bucket = var.databricks_root_storage_bucket
rule {
apply_server_side_encryption_by_default {
kms_master_key_id = aws_kms_key.this.arn
sse_algorithm = "aws:kms"
}
}
}
resource "aws_s3_bucket_versioning" "root_bucket" {
bucket = var.databricks_root_storage_bucket
versioning_configuration {
status = "Enabled"
}
}
resource "aws_s3_bucket_ownership_controls" "root_boot_ownership" {
bucket = var.databricks_root_storage_bucket
rule {
object_ownership = "BucketOwnerPreferred"
}
}
resource "aws_s3_bucket_acl" "root_bucket_acl" {
depends_on = [aws_s3_bucket_ownership_controls.root_boot_ownership]
bucket = var.databricks_root_storage_bucket
acl = "private"
}
resource "aws_s3_bucket_logging" "root_bucket_logging" {
bucket = var.databricks_root_storage_bucket
target_bucket = var.logging_target_bucket
target_prefix = var.logging_target_prefix
}
resource "aws_s3_bucket_public_access_block" "root_storage_bucket" {
bucket = aws_s3_bucket.root_storage_bucket.id
block_public_acls = true
block_public_policy = true
ignore_public_acls = true
restrict_public_buckets = true
depends_on = [aws_s3_bucket.root_storage_bucket]
}
data "databricks_aws_bucket_policy" "this" {
bucket = aws_s3_bucket.root_storage_bucket.bucket
}
resource "aws_s3_bucket_policy" "root" {
bucket = aws_s3_bucket.root_storage_bucket.id
policy = <<POLICY
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Principal": {
"AWS": "arn:aws:iam::account-id:root"
},
"Action": [
"s3:GetObject",
"s3:GetObjectVersion",
"s3:PutObject",
"s3:DeleteObject",
"s3:ListBucket",
"s3:GetBucketLocation"
],
"Resource": [
"arn:aws:s3:::${var.env}databricks/*",
"arn:aws:s3:::${var.env}atabricks"
]
}
]
}
POLICY
}
resource "databricks_mws_networks" "this" {
provider = databricks.mws
account_id = var.databricks_account_id
network_name = var.databricks_network_name
vpc_id = var.databricks_vpc_id
subnet_ids = flatten(var.databricks_subnet_ids)
security_group_ids = var.databricks_security_group_ids
}
resource "databricks_mws_storage_configurations" "this" {
provider = databricks.mws
account_id = var.databricks_account_id
bucket_name = aws_s3_bucket.root_storage_bucket.bucket
storage_configuration_name = var.databricks_root_storage_bucket
}
resource "databricks_mws_credentials" "this" {
#provider = databricks.mws
account_id = var.databricks_account_id
# role_arn = aws_iam_role.cross_account_role.arn
role_arn = var.databricks_role_arn
credentials_name = var.databricks_credentials
# depends_on = [aws_iam_role_policy.this]
}
# resource "databricks_mws_workspaces" "this" {
# provider = databricks.mws
# account_id = var.databricks_account_id
# aws_region = var.region
# workspace_name = var.workspace_name
# # deployment_name = var.workspace_name
# credentials_id = databricks_mws_credentials.this.credentials_id
# storage_configuration_id = databricks_mws_storage_configurations.this.storage_configuration_id
# network_id = databricks_mws_networks.this.network_id
# }
05-20-2024 07:24 AM
Just to add context on what is probably the underlying issue: it requires an account-level service principal (with OAuth).
I experienced the same issue while using a username and password, which is how the TF provider was configured for existing workspaces created prior to 11/2023. It looks like Databricks is expecting/enforcing an account-level service principal in the TF provider for new workspaces after 11/2023.
07-29-2024 07:48 AM
Hello, I'm facing a similar issue. I tried to update my Terraform with the proper authentication and I get this error:
╷
│ Error: cannot create mws credentials: failed visitor: context canceled
│
│ with databricks_mws_credentials.this,
│ on main.tf line 8, in resource "databricks_mws_credentials" "this":
How do you add context, and what is it?
08-02-2024 10:42 AM
Hello Alexandre467,
I'm not sure what your issue is. The reference to "context" in my earlier reply was referring to the situation described in the second paragraph.
If you add a few details about what you have configured and the issue, I can comment.
FYI, here are example workspace-level and account-level provider configs. Please pay attention to the Databricks Terraform provider documentation as to which one you need for the resource you are dealing with:
provider "databricks" {
# Account-level resources such as workspaces
alias = "mws"
host = var.databricks_host // https://accounts.cloud.databricks.com
account_id = var.databricks_account_id
client_id = var.databricks_tfe_client_id
client_secret = var.databricks_tfe_client_secret
}
provider "databricks" {
# workspace level but using account level credential
alias = "databricks-ws"
host = var.databricks_domain_for_workspace // https://xyz-ws_name.cloud.databricks.com
client_id = var.databricks_tfe_client_id
client_secret = var.databricks_tfe_client_secret
}
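To tie this back to the original error: account-level resources then need to reference the aliased provider explicitly. A sketch assuming the `mws` alias above (note that in the original post this `provider` line was commented out, which makes the resource fall back to the default provider):

```hcl
# Account-level resource must use the account-level (mws) provider alias;
# without it, Terraform uses the default (workspace-level) databricks provider.
resource "databricks_mws_credentials" "this" {
  provider         = databricks.mws
  account_id       = var.databricks_account_id
  role_arn         = var.databricks_role_arn
  credentials_name = var.databricks_credentials
}
```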