
Getting an error while editing Databricks metastore "databricks_metastore" using Terraform (Azure)

shibi
New Contributor III

Hi,

I am getting an error ("Metastore has reached the limit for metastores in region") while updating the metastore using Terraform; below is the script I am using to update it. There is already a metastore available and I don't want to create a new one, I just want to update some properties using Terraform.

 

resource "databricks_metastore" "example" {
 provider= databricks.azure_account
 name                   = "metastore_azure_northeurope"
 owner = var.databricksGroupName
 delta_sharing_scope    = "INTERNAL_AND_EXTERNAL"
 delta_sharing_recipient_token_lifetime_in_seconds = 600000
}

Brahmareddy
Valued Contributor III

Hi Shibi,

Please give the code below a try and let me know if the error still persists.

resource "databricks_metastore" "example" {
provider = databricks.azure_account
metastore_id = "your_existing_metastore_id" # Put your metastore ID here
delta_sharing_scope = "INTERNAL_AND_EXTERNAL"
delta_sharing_recipient_token_lifetime_in_seconds = 600000
}

Keep trying! Happy data engineering.

Regards,

Brahma

shibi
New Contributor III

Hi, I am still getting the same issue: "has reached the limit for metastores in region".

Regards,

Shibin 

Brahmareddy
Valued Contributor III

Hi Shibi,

Try assigning the existing metastore_id to your workspace with a databricks_metastore_assignment resource.

resource "databricks_metastore_assignment" "example" {
provider = databricks.azure_account
workspace_id = "your_workspace_id" # Replace with your actual workspace ID
metastore_id = "your_existing_metastore_id" # Ensure this matches the metastore you want to use
}

Save the Terraform file with the above code.

Run Terraform to update the metastore or the assignment.
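The usual command sequence would be roughly the following (just a sketch; review the plan output to confirm Terraform only adds the assignment and does not try to create a new metastore):

terraform init
terraform plan   # check that no new databricks_metastore resource is planned
terraform apply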

shibi
New Contributor III

Hi,

When the Databricks workspace gets created, it is automatically assigned to the metastore. I was only looking to update the metastore's Delta Sharing settings, and when we use "databricks_metastore", the "name" parameter is mandatory.
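Side note: if the existing metastore is not in the Terraform state, Terraform will plan a brand-new one, which is typically what triggers the region limit error. A rough sketch of first importing the existing metastore into state, assuming the provider supports import for this resource; the ID below is a placeholder:

# bring the already existing metastore under Terraform management
terraform import databricks_metastore.example your_existing_metastore_id

After that, terraform plan against the databricks_metastore block should show an in-place update rather than a create.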

Brahmareddy
Valued Contributor III

Oh okay, Shibi.

Give the approach below a try.

resource "databricks_metastore" "example" {
provider = databricks.azure_account
metastore_id = "your_existing_metastore_id" # Ensure this is the correct metastore ID
name = "metastore_azure_northeurope" # This should match the existing metastore's name
delta_sharing_scope = "INTERNAL_AND_EXTERNAL"
delta_sharing_recipient_token_lifetime_in_seconds = 600000
}

To avoid creating the metastore again, check the following:

Make sure the metastore_id and name match the existing metastore in your Databricks environment exactly, and double-check that the provider configuration is correctly pointing to your existing Databricks account.
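For reference, an account-level provider block for Azure usually looks roughly like this (a sketch only; the account ID is a placeholder and authentication settings are omitted):

provider "databricks" {
  alias      = "azure_account"
  host       = "https://accounts.azuredatabricks.net" # account-level endpoint for Azure
  account_id = "your_databricks_account_id"           # placeholder, replace with your account ID
}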

Give it a try and let me know if it works.

Regards,

Brahma

 

shibi
New Contributor III

Hi, thank you for the suggestions.

But I am still getting the same error.
