Hello all, I'm having an issue and, after trying different things, I'm still not able to find the root cause of the problem.
I have my Terraform Databricks provider (version v1.49.1) configured like this:
provider "databricks" {
  host = module.databricks.databricks.workspace_url
}
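For reference, the Databricks provider also accepts explicit Azure service-principal arguments instead of relying on default credential resolution. A minimal sketch, assuming a service principal exists; the `var.*` names are placeholders I made up, not part of my real config:

```hcl
# Sketch only: the azure_* argument names come from the Databricks
# provider docs; the var.* references are placeholder variables.
provider "databricks" {
  host                = module.databricks.databricks.workspace_url
  azure_client_id     = var.client_id
  azure_client_secret = var.client_secret
  azure_tenant_id     = var.tenant_id
}
```

Alternatively, the same values can be supplied via the ARM_CLIENT_ID, ARM_CLIENT_SECRET, and ARM_TENANT_ID environment variables that the error message below refers to.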
and then my "databricks_storage_credential" resource looks like this:
resource "databricks_storage_credential" "external" {
  name = azurerm_databricks_access_connector.unity.name
  azure_managed_identity {
    access_connector_id = azurerm_databricks_access_connector.unity.id
  }
  isolation_mode = "ISOLATION_MODE_ISOLATED"
  comment        = "Managed by TF"
}
I have no problem with provisioning or destruction. I can even deploy a cluster within the Databricks workspace afterwards, but on certain updates, like a tag update or adding a new resource group, I get the error below:
Error: cannot read storage credential: failed during request visitor: default auth: cannot configure default credentials, please check https://docs.databricks.com/en/dev-tools/auth.html#databricks-client-unified-authentication to configure credentials for your preferred authentication method. Config: azure_client_secret=***, azure_client_id=<CLIENT_ID>, azure_tenant_id=<TENANT_ID>. Env: ARM_CLIENT_SECRET, ARM_CLIENT_ID, ARM_TENANT_ID
│
│ with module.databricks.databricks_storage_credential.external,
│ on modules/terraform-azurerm-databricks-workspace/main.tf line 191, in resource "databricks_storage_credential" "external":
│ 191: resource "databricks_storage_credential" "external" {
It looks to me like I'm missing a read permission somewhere. Any advice? Any help will be more than welcome 🙂
Thanks in advance...