04-17-2025 02:01 AM
How can I use Terraform to assign an external location to multiple workspaces?
When I create an external location with Terraform, I do not see any option to directly link workspaces; the location is only linked to the workspace of the Databricks profile that I use to deploy Terraform.
I can assign a second workspace manually with the UI, but how do I do it with Terraform?
My current Terraform code looks like this:
resource "databricks_external_location" "extloc" {
name = "external-location-name"
url = "abfss://%s@%s.dfs.core.windows.net"
credential_name = "xxxxxxx"
owner = "xxxxxxx"
isolation_mode = "ISOLATION_MODE_ISOLATED"
comment = "Created by Terraform"
}
resource "databricks_grants" "ggggggg" {
external_location = databricks_external_location.extloc.id
grant {
principal = "xyz"
privileges = ["ALL_PRIVILEGES"]
}
grant {
principal = "abc"
privileges = ["ALL_PRIVILEGES"]
}
depends_on = [databricks_external_location.extloc]
}
04-21-2025 07:36 AM
Managing multi-workspace bindings for external locations entirely through Terraform is currently a limitation.
04-21-2025 06:20 PM
@Walter_C I think you need to use the databricks_workspace_binding resource for that multi-workspace binding. I was able to achieve it in Terraform, and the resource docs seem to agree with the result that I have. My Databricks runs on Google Cloud.
My Terraform configuration: GitHub Gist
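For anyone who can't open the gist, here's a minimal sketch of the approach, not the gist's actual contents: one binding resource per workspace, pointing at the extloc resource from the original question. The workspace IDs here are hypothetical placeholders, and the exact securable_type string should be confirmed against your provider version's databricks_workspace_binding docs.

```
# Bind the isolated external location to two specific workspaces.
# Workspace IDs below are hypothetical; replace with your own.
resource "databricks_workspace_binding" "ws_one" {
  securable_name = databricks_external_location.extloc.name
  securable_type = "external_location" # some provider versions document "external-location"; check your docs
  workspace_id   = 1234567890123456
}

resource "databricks_workspace_binding" "ws_two" {
  securable_name = databricks_external_location.extloc.name
  securable_type = "external_location"
  workspace_id   = 6543210987654321
}
```

Each binding grants one additional workspace access to the location, so you add one resource per workspace that needs it.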
@VicS I think the reason your external location gets assigned only to the "current" workspace - by which I mean the workspace you referenced in your Databricks Terraform provider configuration (the host parameter) - is that you've specified
isolation_mode = "ISOLATION_MODE_ISOLATED"
By default, Databricks assigns the securable to all workspaces attached to the current metastore, but when you set this parameter you get the behaviour you are seeing. At least, that's what the databricks_external_location docs say.
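To illustrate that default, a sketch (the resource name is hypothetical, and I'm assuming the documented default applies): with isolation_mode omitted, the location stays open to every workspace attached to the metastore, so no bindings are needed at all.

```
# Sketch: omitting isolation_mode leaves the location in the default open
# mode (ISOLATION_MODE_OPEN per the docs), reachable from every workspace
# attached to the metastore, so no databricks_workspace_binding is required.
resource "databricks_external_location" "extloc_open" {
  name            = "external-location-name"
  url             = "abfss://%s@%s.dfs.core.windows.net"
  credential_name = "xxxxxxx"
  comment         = "Created by Terraform"
}
```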
04-21-2025 11:16 PM
The isolation mode setting is required because I do not want to give access to every other workspace, only to those two specific ones. Your solution of extending access to another workspace with databricks_workspace_binding worked though, thank you!
04-21-2025 10:46 PM
I can't create two databricks_external_location resources for the same external location (within one metastore), so that won't work.
04-22-2025 04:22 AM
But I don't have two `databricks_external_location`s in my Terraform? I create two bindings to a single location.
04-22-2025 06:47 AM
It was in response to the comment by "omwer21s" 🙂 Your solution works.