Hello Databricks Community,
I am currently facing an issue while trying to export and import all catalogs from an old workspace to a newly created workspace using Databricks' REST API and Python. Here's a summary of my situation:
Objective: I am attempting to export all catalogs, including 'hive_metastore', from an old workspace and import them into a newly created workspace using Python scripts and the Databricks REST API.
Current Status: I have successfully exported and imported all catalogs except for 'hive_metastore' using the following API endpoint:
endpoint_url = f"{workspace_url}/api/2.1/unity-catalog/catalogs"
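For context, here is a minimal sketch of the export/import loop I am running (the workspace URLs and tokens below are placeholders, and error handling is trimmed):

import requests

# Placeholder values; the real URLs and personal access tokens come from my config.
OLD_WORKSPACE_URL = "https://old-workspace.cloud.databricks.com"
NEW_WORKSPACE_URL = "https://new-workspace.cloud.databricks.com"
OLD_TOKEN = "<old-workspace-pat>"
NEW_TOKEN = "<new-workspace-pat>"

def list_catalogs(workspace_url, token):
    # GET /api/2.1/unity-catalog/catalogs lists the Unity Catalog catalogs
    endpoint_url = f"{workspace_url}/api/2.1/unity-catalog/catalogs"
    response = requests.get(endpoint_url, headers={"Authorization": f"Bearer {token}"})
    response.raise_for_status()
    return response.json().get("catalogs", [])

def create_catalog(workspace_url, token, catalog):
    # POST /api/2.1/unity-catalog/catalogs creates the catalog in the target workspace
    endpoint_url = f"{workspace_url}/api/2.1/unity-catalog/catalogs"
    payload = {"name": catalog["name"], "comment": catalog.get("comment", "")}
    response = requests.post(
        endpoint_url, headers={"Authorization": f"Bearer {token}"}, json=payload
    )
    response.raise_for_status()
    return response.json()

for catalog in list_catalogs(OLD_WORKSPACE_URL, OLD_TOKEN):
    # 'hive_metastore' is never returned by this listing, which is the problem
    create_catalog(NEW_WORKSPACE_URL, NEW_TOKEN, catalog)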
Issue: The endpoint above does not handle the 'hive_metastore' catalog (it does not appear in the listing), so I have not been able to export or import it. I am specifically looking for guidance on how to achieve this using the Databricks REST API and Python.
Additional Query: If exporting and importing the 'hive_metastore' catalog via the REST API is not feasible, what alternative methods or approaches can be used to achieve this task?
Could someone please provide insights or example code snippets on how to properly handle the export and import of the 'hive_metastore' catalog between Databricks workspaces using either the REST API or alternative methods?
Any documentation links, specific API endpoints related to 'hive_metastore', or alternative approaches would be greatly appreciated.
Thank you