How can I use Terraform to assign an external location to multiple workspaces?

VicS
Contributor

When I create an external location with Terraform, I do not see any option to directly link workspaces. It also only links to the workspace of the Databricks profile that I use to deploy Terraform.

I can assign a second workspace manually with the UI, but how do I do it with Terraform?


My current Terraform code looks like this:

resource "databricks_external_location" "extloc" {
name = "external-location-name"
url = "abfss://%s@%s.dfs.core.windows.net"
credential_name = "xxxxxxx"
owner = "xxxxxxx"
isolation_mode = "ISOLATION_MODE_ISOLATED"
comment = "Created by Terraform"
}

resource "databricks_grants" "ggggggg" {
external_location = databricks_external_location.extloc.id
grant {
principal = "xyz"
privileges = ["ALL_PRIVILEGES"]
}
grant {
principal = "abc"
privileges = ["ALL_PRIVILEGES"]
}
depends_on = [databricks_external_location.extloc]
}

 

1 ACCEPTED SOLUTION

TheRealOliver
Contributor

@Walter_C I think you need to use the databricks_workspace_binding resource for that multi-workspace binding. I was able to achieve it in Terraform, and the resource docs seem to agree with the result I got. My Databricks runs on Google Cloud.

My Terraform configuration: GitHub Gist

@VicS I think the reason your external location gets assigned only to the "current" workspace (by which I mean the workspace referenced in the host parameter of your Databricks Terraform provider configuration) is that you've specified

isolation_mode = "ISOLATION_MODE_ISOLATED"

By default, Databricks assigns the securable to all workspaces attached to the current metastore, but when you set this parameter you get the behaviour you are seeing. At least, that's what the databricks_external_location docs say.
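
In case it's useful, a minimal sketch of what one such binding can look like (the workspace ID is a placeholder, and the exact securable_type spelling varies between provider versions, so check the databricks_workspace_binding docs for the version you run):

resource "databricks_workspace_binding" "extloc_second_ws" {
  # Bind the existing external location to an additional workspace.
  securable_name = databricks_external_location.extloc.name
  securable_type = "external_location"        # some provider versions expect "external-location"
  workspace_id   = 1234567890                 # placeholder: the second workspace's numeric ID
  binding_type   = "BINDING_TYPE_READ_WRITE"  # the default; BINDING_TYPE_READ_ONLY also exists
}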

View solution in original post

6 REPLIES

Walter_C
Databricks Employee

Currently, managing multi-workspace bindings for external locations entirely through Terraform is a limitation.


VicS
Contributor

@TheRealOliver The isolation mode setting was required because I do not want to give access to every other workspace, only those two specific ones. Your solution of extending access to another workspace with "databricks_workspace_binding" worked though, thank you!
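
For reference, one way to express "exactly these two workspaces" is one binding per workspace ID via for_each; a rough sketch (the local name and IDs are placeholders, with the same securable_type caveat as above):

locals {
  # Placeholder IDs of the only two workspaces that should see the location.
  extloc_workspace_ids = ["1111111111", "2222222222"]
}

resource "databricks_workspace_binding" "extloc" {
  for_each       = toset(local.extloc_workspace_ids)
  securable_name = databricks_external_location.extloc.name
  securable_type = "external_location" # some provider versions expect "external-location"
  workspace_id   = tonumber(each.value)
}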

I can't create two databricks_external_location resources for the same external location (within one metastore), so that won't work.

TheRealOliver
Contributor

But I don't have two `databricks_external_location` resources in my Terraform? I create two bindings to a single location.

It was in response to the comment by "omwer21s" 🙂 Your solution works.
