Hello,
For testing purposes, I'm interested in creating a new workspace with Unity Catalog enabled, and from there accessing external (S3) tables registered in an existing legacy Hive metastore workspace (not UC enabled). The goal is for both workspaces to point to the same underlying S3 external location.
As a requirement, I do not want to duplicate any data, and ideally updates made to the data from the legacy workspace would be reflected in the tables surfaced through UC.
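For context, what I'm imagining on the UC side is roughly the following (catalog, schema, table, and path names are made up, and it assumes an account admin has already set up a storage credential and external location covering the bucket):

```python
# Minimal sketch, run from a notebook on the new UC-enabled workspace.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # already provided in Databricks notebooks

# Same S3 path the legacy HMS external table already points at (hypothetical).
s3_path = "s3://my-bucket/warehouse/events"

# Register a UC external table over the existing Delta files; no data is copied,
# so writes from the legacy workspace should be visible here on the next read.
spark.sql(f"""
    CREATE TABLE IF NOT EXISTS my_catalog.my_schema.events
    USING DELTA
    LOCATION '{s3_path}'
""")
```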
I was considering the possibility of shallow cloning; however, from my understanding that is not possible across UC and the Hive metastore when they live in two separate workspaces.
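For reference, the kind of clone I had in mind would look something like this (table names are hypothetical, and again this is the approach I believe does not work across two workspaces, which is why I'm asking):

```python
# Hypothetical shallow clone of the legacy HMS table into UC. My understanding
# is that this only works when both catalogs are visible to the same workspace.
spark.sql("""
    CREATE TABLE my_catalog.my_schema.events_clone
    SHALLOW CLONE hive_metastore.default.events
""")
```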
Does anybody have experience with, or recommendations for, doing this? Looking through the Databricks documentation, I'm mostly finding information on upgrading a legacy workspace in place, not on running the two side by side against the same data.
#unitycatalog #hivemetastore