08-14-2024 06:56 AM
Hi,
I am getting the error "Metastore has reached the limit for metastores in region" while updating a metastore using Terraform; below is the script I am using. A metastore already exists and I do not want to create a new one, I just want to update some of its properties with Terraform.
08-14-2024 09:25 PM
Hi Shibi,
Please try the code below and let me know if the error still persists.
resource "databricks_metastore" "example" {
  provider                                          = databricks.azure_account
  metastore_id                                      = "your_existing_metastore_id" # Put your metastore ID here
  delta_sharing_scope                               = "INTERNAL_AND_EXTERNAL"
  delta_sharing_recipient_token_lifetime_in_seconds = 600000
}
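One thing worth noting: if this resource block is not already tracked in Terraform state, Terraform will try to create a brand-new metastore rather than update the existing one, which is exactly what triggers the region-limit error. A possible workaround (just a sketch; the resource address "databricks_metastore.example" and the metastore ID are placeholders) is to import the existing metastore into state first, then confirm the plan shows an in-place update instead of a create:

terraform import databricks_metastore.example "your_existing_metastore_id"
terraform plan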
Keep trying! Happy data engineering.
Regards,
Brahma
08-15-2024 07:26 AM
Hi, I am still getting the same issue: "has reached the limit for metastores in region".
Regards,
Shibin
08-15-2024 08:18 AM
Hi Shibin,
Try assigning the metastore_id to your workspace.
resource "databricks_metastore_assignment" "example" {
  provider     = databricks.azure_account
  workspace_id = "your_workspace_id"           # Replace with your actual workspace ID
  metastore_id = "your_existing_metastore_id"  # Ensure this matches the metastore you want to use
}
Save the Terraform file with the above code, then run terraform apply to update the metastore assignment.
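The steps above can be sketched as the usual Terraform workflow (assuming the assignment resource is named "example" as shown; review the plan to make sure it proposes only the assignment change and not a new metastore):

terraform init
terraform plan
terraform apply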
08-15-2024 08:25 AM
Hi,
When the Databricks workspace gets created, it is automatically assigned to the metastore. I was looking to update the metastore's Delta Sharing settings, and when we use "databricks_metastore", the "name" parameter is mandatory.
08-15-2024 09:01 AM
Oh okay, Shibi.
Give the approach below a try.
resource "databricks_metastore" "example" {
  provider                                          = databricks.azure_account
  metastore_id                                      = "your_existing_metastore_id" # Ensure this is the correct metastore ID
  name                                              = "metastore_azure_northeurope" # This should match the existing metastore's name
  delta_sharing_scope                               = "INTERNAL_AND_EXTERNAL"
  delta_sharing_recipient_token_lifetime_in_seconds = 600000
}
To avoid creating the metastore again, make sure the metastore_id and name match the existing metastore in your Databricks environment exactly, and double-check that the provider configuration is pointing to that environment.
Give it a try and let me know if it works.
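To verify that Terraform already tracks the existing metastore (rather than planning to create a new one), inspecting the state can help; a sketch, assuming the resource address is "databricks_metastore.example":

terraform state list | grep databricks_metastore
terraform plan -target=databricks_metastore.example

If the resource does not appear in the state list, Terraform will attempt a create, which is what produces the region-limit error.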
Regards,
Brahma
08-15-2024 10:52 AM
Hi, thank you for the suggestions, but I am still getting the same error.