
Terraform: Add Key Vault Administrator Role Assignment and Save Outputs to JSON Dynamically in Azure

Sudheer2
New Contributor III

Hi everyone,

I am using Terraform to provision an Azure OpenAI service (via a community module) along with a Key Vault. While the OpenAI service setup works as expected, I am facing two challenges:

  1. Role Assignment for Key Vault

I need to assign the Key Vault Administrator role to my service so it can access and manage keys. However, I'm unsure how to implement this using Terraform.

  2. Save Output Variables Dynamically to JSON

After the resources are created, I need to save the following details dynamically in a JSON file:

• openai.api_type

• openai.api_base

• openai.api_version

• openai.api_key

• engine

Here is a snippet of my current Terraform code:

terraform {
  backend "local" { path = "terraform-example1.tfstate" }
}

provider "azurerm" {
  features {
    key_vault {
      purge_soft_delete_on_destroy = true
    }
  }
  client_id       = var.client_id
  client_secret   = var.client_secret
  tenant_id       = var.tenant_id
  subscription_id = var.subscription_id
}

resource "azurerm_resource_group" "rg" {
  name     = var.resource_group_name
  location = var.location
}

module "openai" {
  source  = "Pwd9000-ML/openai-service/azurerm"
  version = ">= 1.1.0"

  location = var.location

  keyvault_resource_group_name                 = azurerm_resource_group.rg.name
  kv_config                                    = var.kv_config
  keyvault_firewall_default_action             = var.keyvault_firewall_default_action
  keyvault_firewall_bypass                     = var.keyvault_firewall_bypass
  keyvault_firewall_allowed_ips                = var.keyvault_firewall_allowed_ips
  keyvault_firewall_virtual_network_subnet_ids = var.keyvault_firewall_virtual_network_subnet_ids

  create_openai_service                     = var.create_openai_service
  openai_resource_group_name                = azurerm_resource_group.rg.name
  openai_account_name                       = var.openai_account_name
  openai_custom_subdomain_name              = var.openai_custom_subdomain_name
  openai_sku_name                           = var.openai_sku_name
  openai_local_auth_enabled                 = var.openai_local_auth_enabled
  openai_outbound_network_access_restricted = var.openai_outbound_network_access_restricted
  openai_public_network_access_enabled      = var.openai_public_network_access_enabled
  openai_identity                           = var.openai_identity

  create_model_deployment = var.create_model_deployment
  model_deployment        = var.model_deployment
}

Questions:

  1. How can I add a Key Vault Administrator role assignment for the service using Terraform?
  2. What is the best way to save output variables dynamically to a JSON file after the resources are created?

Any help or examples would be greatly appreciated!

Thanks in advance!

2 REPLIES

Alberto_Umana
Databricks Employee

Hi @Sudheer2,

For your first question: to assign the Key Vault Administrator role to your service using Terraform, you can use the azurerm_role_assignment resource. Here is an example of how you can add this to your Terraform configuration:

resource "azurerm_role_assignment" "key_vault_admin" {

  scope                = azurerm_key_vault.example.id

  role_definition_name = "Key Vault Administrator"

  principal_id         = azurerm_user_assigned_identity.example.principal_id

}

In this example, replace azurerm_key_vault.example.id with the actual ID of your Key Vault and azurerm_user_assigned_identity.example.principal_id with the principal ID of the service that needs the role assignment.
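
A related variation: if the role should go to the identity Terraform itself runs as (the service principal from the provider block in your post) rather than a separate user-assigned identity, the azurerm_client_config data source supplies that principal's object ID, so nothing needs to be hard-coded. Note that module.openai.key_vault_id below is an assumed output name for illustration; check the Pwd9000-ML/openai-service module's documented outputs for the real one.

# Look up the identity the azurerm provider is authenticated as.
data "azurerm_client_config" "current" {}

resource "azurerm_role_assignment" "key_vault_admin" {
  # Assumed module output; verify against the module's outputs.
  scope                = module.openai.key_vault_id
  role_definition_name = "Key Vault Administrator"
  # Object ID of the principal running Terraform.
  principal_id         = data.azurerm_client_config.current.object_id
}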

Alberto_Umana
Databricks Employee

For question two, you can expose the values as outputs and use the local_file resource in Terraform to write them to a JSON file:

output "openai_api_type" {
value = module.openai.api_type
}

output "openai_api_base" {
value = module.openai.api_base
}

output "openai_api_version" {
value = module.openai.api_version
}

output "openai_api_key" {
value = module.openai.api_key
}

output "engine" {
value = module.openai.engine
}

resource "local_file" "output_json" {
content = jsonencode({
openai_api_type = module.openai.api_type,
openai_api_base = module.openai.api_base,
openai_api_version = module.openai.api_version,
openai_api_key = module.openai.api_key,
engine = module.openai.engine
})
filename = "${path.module}/output.json"
}
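
One caveat: the output names above are assumptions based on your question, so verify them against the module. If the module marks api_key as sensitive, Terraform will reject a plain output unless it is flagged sensitive = true, and local_sensitive_file (from the hashicorp/local provider, v2.2+) keeps the rendered JSON out of plan/apply output. A minimal sketch, replacing the corresponding blocks above:

output "openai_api_key" {
  value     = module.openai.api_key
  sensitive = true
}

# Behaves like local_file but redacts the content in plan/apply
# output; the value is still stored in Terraform state.
resource "local_sensitive_file" "output_json" {
  content  = jsonencode({ openai_api_key = module.openai.api_key })
  filename = "${path.module}/output.json"
}

Either way, keep in mind that the key ends up in plain text both in the JSON file and in Terraform state, so protect both accordingly.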
