10-24-2022 10:03 AM
Hi, we are having problems creating some resources with Terraform after the Unity Catalog migration. We created a group and a service principal (SVC Principal) at the account level with Terraform, as described in the docs (AWS infrastructure): https://registry.terraform.io/providers/databricks/databricks/latest/docs/resources/group
resource "databricks_group" "group_account_level" {
display_name = "databricks_deployments"
allow_cluster_create = true
allow_instance_pool_create = true
workspace_access = true
provider = databricks.account_level
}
resource "databricks_service_principal" "spark_account_level" {
display_name = "SVC_SPARK"
provider = databricks.account_level
}
resource "databricks_group_member" "spark_deployments" {
group_id = databricks_group.group_account_level.id
member_id = databricks_service_principal.spark_account_level.id
provider = databricks.account_level
}
The account-level provider uses the Databricks accounts host from the documentation (https://accounts.cloud.databricks.com).
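For reference, the two provider aliases are configured roughly like this (a minimal sketch; the account ID and workspace URL are placeholders, and authentication settings are omitted):

provider "databricks" {
  alias      = "account_level"
  host       = "https://accounts.cloud.databricks.com"
  account_id = "<DATABRICKS_ACCOUNT_ID>"
  # account-console authentication (username/password or OAuth client) omitted
}

provider "databricks" {
  alias = "workspace_url"
  host  = "https://<WORKSPACE_DEPLOYMENT_NAME>.cloud.databricks.com"
  # workspace authentication (e.g. a PAT) omitted
}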
We need to use the service principal with the following resources: databricks_permissions, databricks_group_role, and databricks_secret_acl.
I tried the following combinations, and all of them failed:
1. Using the Databricks provider that was used to create the service principal at the account level:
resource "databricks_group_role" "instance_profile_group_deployments" {
group_id = databricks_group.group_account_level.id
role = databricks_instance_profile.ds.id
provider = databricks.account_level
}
It looks like this API is not available at the account level. (The databricks_instance_profile referenced here is shown after the second attempt for context.)
2. Using the provider pointing at the workspace:
resource "databricks_group_role" "instance_profile_group_deployments" {
group_id = databricks_group.group_account_level.id
role = databricks_instance_profile.ds.id
provider = databricks.workspace_url
}
This complains that it cannot find the group with ID XXXX; the group does exist, but at the account level.
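For context, the databricks_instance_profile.ds referenced in both attempts is a workspace-level resource defined roughly like this (the ARN below is a placeholder, not our real one):

resource "databricks_instance_profile" "ds" {
  provider             = databricks.workspace_url
  instance_profile_arn = "arn:aws:iam::<AWS_ACCOUNT_ID>:instance-profile/ds"
}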
Any help will be appreciated
11-01-2022 02:40 AM
Hi @Jordi Casanella,
I have been working with Terraform for Databricks lately, and I had to switch my approach a couple of times due to issues like the one you have right now (account-level vs. workspace-level API).
I assume you didn't have issues with this part and were able to create:
resource "databricks_group" "group_account_level" {
display_name = "databricks_deployments"
allow_cluster_create = true
allow_instance_pool_create = true
workspace_access = true
provider = databricks.account_level
}
resource "databricks_service_principal" "spark_account_level" {
display_name = "SVC_SPARK"
provider = databricks.account_level
}
resource "databricks_group_member" "spark_deployments" {
group_id = databricks_group.group_account_level.id
member_id = databricks_service_principal.spark_account_level.id
provider = databricks.account_level
}
To be able to use the SP at the workspace level with the resources you mentioned (databricks_permissions, databricks_group_role, databricks_secret_acl), you need to assign the group to the workspace. You can achieve this with `databricks_mws_permission_assignment`:
resource "databricks_mws_permission_assignment" "ws_access" {
provider = databricks.account_level
workspace_id = <YOUR_WORKSPACE_ID>
principal_id = databricks_group.group_account_level.id
permissions = ["USER"]
}
Then you can use that group in your workspace:
resource "databricks_permissions" "this" {
provider = databricks.workspace
authorization = "tokens"
access_control {
group_name = databricks_group.group_account_level.display_name
permission_level = "CAN_USE"
}
}
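The same pattern should work for the instance profile role from your first attempt: once the group is assigned to the workspace, look it up with the workspace-level provider and attach the role there. A sketch using your resource names (I haven't run this against your setup):

data "databricks_group" "deployments" {
  provider     = databricks.workspace
  display_name = databricks_group.group_account_level.display_name
  # make sure the group is assigned to the workspace before the lookup
  depends_on   = [databricks_mws_permission_assignment.ws_access]
}

resource "databricks_group_role" "instance_profile_group_deployments" {
  provider = databricks.workspace
  group_id = data.databricks_group.deployments.id
  role     = databricks_instance_profile.ds.id
}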
thanks,
Pat.
10-25-2022 05:32 AM
Do we need to create these under both the account and the workspace?
11-03-2022 05:34 AM
@Pat Sienkiewicz After checking the Terraform GitHub implementation and the API, I found what you did a couple of days ago.
Just adding the API doc here in case it can help someone else.
So thanks a lot, you're awesome.