
Databricks Unity Catalog Metastore

TugrulA
New Contributor II

Hey everyone,

I deleted my Unity Catalog metastore and now want to point it to another Azure storage account (ADLS). However, once a metastore is created, its storage location cannot be changed. Therefore, I deleted the existing metastore and created a new one with the path to my new storage account.

My organization—and Databricks Unity Catalog—supports only one region per metastore. Because of this, I had to delete and recreate the metastore. Unfortunately, all objects in the deleted metastore are gone. We still have the metadata in an Azure container, but it’s managed by Databricks and the folders are named with random UUIDs.

I have two questions:

  1. How can I restore all objects? Is it possible for the new metastore (with the same name) to automatically include all of the original objects?

  2. If I need to restore the objects manually, how can I identify which _delta_log metadata folder corresponds to which table?

Thank you in advance.


2 REPLIES

sarahbhord
Databricks Employee

Hey TugrulA -

1. Deleting a Unity Catalog metastore permanently removes all associated objects, and a new metastore won't automatically include the original objects; automatic recovery is unfortunately not possible. While UC allows UNDROP for individual tables within 7 days, this does not apply to entire metastores. If you have a pre-deletion backup of the metastore metadata (e.g. via DESCRIBE HISTORY or versioned storage), use that to recreate your objects. Otherwise, contact Databricks support directly; they may be able to assist with a partial recovery. See https://docs.databricks.com/aws/en/data-governance/unity-catalog/manage-metastore.
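
A minimal sketch of that table-level recovery, assuming the tables were dropped less than 7 days ago inside a metastore that still exists (it cannot revive a deleted metastore); the catalog, schema and table names are placeholders:

      # List managed tables that are still recoverable, then restore one of them
      spark.sql("SHOW TABLES DROPPED IN my_catalog.my_schema").show()
      spark.sql("UNDROP TABLE my_catalog.my_schema.my_table")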

2. Each Delta table's _delta_log contains JSON commit files with metadata. Look for metaData actions in these files, which include a name field that, when populated, identifies the table.

  • Example workflow:

    1. List all directories under the old Azure storage container.

    2. For each UUID directory:

      dbutils.fs.ls("abfss://<container>@<storage>.dfs.core.windows.net/<UUID>/_delta_log/")

    3. Inspect the most recent JSON log file for "commitInfo"."operationParameters"."name".
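
Building on that workflow, a minimal sketch (assuming the <container>, <storage> and <UUID> placeholders are replaced with your own values) that reads one table's Delta log with Spark and pulls out the metaData action, which carries the table id, the name if the writer recorded one, and the schema:

      # Read every commit JSON in this table's _delta_log; the metaData action
      # (written at commit 0 and whenever the schema changes) holds id, name and schema.
      log_path = "abfss://<container>@<storage>.dfs.core.windows.net/<UUID>/_delta_log/*.json"
      actions = spark.read.json(log_path)
      meta = (actions
              .where("metaData IS NOT NULL")
              .select("metaData.id", "metaData.name", "metaData.schemaString"))
      meta.show(truncate=False)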

Critical Considerations: Managed table data from the deleted metastore is permanently removed after 30 days. If tables were external, their data remains intact in storage, but you'll need to reregister them in the new metastore using their original paths.
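
For the external-table case, a minimal sketch of re-registering one table in the new metastore, assuming an external location and storage credential already cover the path (catalog, schema, table name and path are placeholders):

      # Re-register an external Delta table at its original path; no data is copied or moved.
      spark.sql("""
        CREATE TABLE IF NOT EXISTS new_catalog.my_schema.my_table
        USING DELTA
        LOCATION 'abfss://<container>@<storage>.dfs.core.windows.net/<original-table-path>'
      """)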

TugrulA
New Contributor II

Hey sarahbhord,

First of all, thank you. I've solved one step: I can now access all the metadata and the data itself (we had an issue with our VNet settings), and your response is correct. We have no way to attach the old objects to the new metastore (which is a very poor solution in my view). One big issue remains, and solving it would really help me: Databricks doesn't save the table names. I have lots of metadata folders, but I can't work out which table each one belongs to. If I had the table name for each metadata folder alongside the metadata itself, it would make my restore concept much easier.
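
One possible way to narrow this down, sketched with the same placeholder paths as above: loop over every UUID folder, collect whatever the Delta log does record (table id, name when present, JSON schema), and fall back to the column names to tell tables apart when the name field is empty:

      import json
      from pyspark.sql.types import StructType

      base = "abfss://<container>@<storage>.dfs.core.windows.net/"
      rows = []
      for entry in dbutils.fs.ls(base):
          try:
              meta = (spark.read.json(entry.path + "_delta_log/*.json")
                      .where("metaData IS NOT NULL")
                      .selectExpr("metaData.id AS table_id",
                                  "metaData.name AS table_name",
                                  "metaData.schemaString AS schema_json")
                      .limit(1)
                      .collect())
          except Exception:
              continue  # not a Delta table folder, or the log is unreadable
          if meta:
              cols = StructType.fromJson(json.loads(meta[0].schema_json)).fieldNames()
              rows.append((entry.path, meta[0].table_id, meta[0].table_name, ", ".join(cols)))

      mapping = spark.createDataFrame(
          rows, "uuid_dir string, table_id string, table_name string, columns string")
      display(mapping)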