
Disaster Recovery Issue

karthik_p
Esteemed Contributor

We are trying to set up disaster recovery (DR) for Unity Catalog (UC) enabled workspaces in Azure. Our UC metastores are in different regions.

1. We are trying to use DEEP CLONE.

2. In the source workspace, we are adding the region-2 metastore storage as an external location.

3. We are able to do the deep clone (a rough sketch is below).
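
For reference, the clone step looks roughly like this; the catalog, schema, table, and storage account names are just placeholders:

  -- run in the source (region-1) workspace; the path points to region-2 storage
  -- that has been registered as an external location in the source metastore
  CREATE OR REPLACE TABLE source_catalog.dr.orders_clone
  DEEP CLONE source_catalog.sales.orders
  LOCATION 'abfss://dr@region2storage.dfs.core.windows.net/clones/orders';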

The problem is that the table ends up as external, because we are using LOCATION in the deep clone. When we point at the target metastore storage as the external location, the data does get migrated, but when we try to convert the table back to managed in the target, it does not let us map and convert it, since the metastore root storage is meant for managed data, not external data.

Do we need to bring in one more external location, deep clone into that, then register it as an external location in the target, create an external table in the target metastore, and finally convert that back to managed?
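
In other words, something roughly like this on the target (region-2) side; the names and paths are placeholders, and the "convert back to managed" step is a copy via CTAS rather than an in-place conversion:

  -- register the cloned path as an external table in the target metastore
  CREATE TABLE IF NOT EXISTS target_catalog.sales.orders_ext
  USING DELTA
  LOCATION 'abfss://dr@region2storage.dfs.core.windows.net/clones/orders';

  -- then materialize a managed copy (no LOCATION clause), since the external
  -- table cannot simply be flipped to managed
  CREATE OR REPLACE TABLE target_catalog.sales.orders
  AS SELECT * FROM target_catalog.sales.orders_ext;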

Is there any other recommended solution for DR besides DEEP CLONE?

3 REPLIES

-werners-
Esteemed Contributor III

Another option is to copy the physical files. For Delta Lake tables that is suboptimal, since you also copy the obsolete data files (old versions kept around for time travel), but it is an option.

karthik_p
Esteemed Contributor

@-werners- Yes, but most of our tables here are managed, and the problem is that deep clone is not behaving as needed.

e.g. we deep clone from source to target (CREATE OR REPLACE TABLE clone1 DEEP CLONE <source_table> LOCATION ""), and it exports the table data into the location provided (e.g. the region-2 metastore storage).

But when we run the DDL statement in the target, it only creates an empty table; the data is not being mapped into the managed table.
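
To illustrate with placeholder names: the DDL on the target side is just a plain CREATE TABLE, so it comes up empty because nothing in it references the cloned files:

  -- recreating the table in the target from its DDL alone gives an empty managed table;
  -- nothing here points to the files that DEEP CLONE wrote to the region-2 path
  CREATE TABLE target_catalog.sales.orders (
    order_id BIGINT,
    amount   DOUBLE
  );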

-werners-
Esteemed Contributor III

Right, I get it.
Actually, cloning it as external seems logical to me for the moment, as Unity Catalog cannot manage the other metastore.
For now I would go with cloning the data and then creating an external table on top of it.
Not ideal, but at least you have a backup of the data.
