Hi everyone,
We are planning a migration from Azure Databricks to GCP Databricks and would like to understand whether Databricks Asset Bundles (DAB) can be used to migrate workspace assets such as jobs, pipelines, notebooks, and custom serving endpoints.
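For context, this is a rough sketch of the kind of bundle definition we are picturing; the bundle name, job, notebook path, node type, and workspace host URLs below are placeholders rather than our real setup:

```yaml
# databricks.yml -- rough sketch only; names, paths, and hosts are placeholders
bundle:
  name: azure-to-gcp-migration-poc

resources:
  jobs:
    nightly_etl:
      name: nightly-etl
      tasks:
        - task_key: ingest
          notebook_task:
            notebook_path: ./notebooks/ingest
          new_cluster:
            spark_version: 15.4.x-scala2.12
            num_workers: 2
            node_type_id: Standard_DS3_v2  # current Azure node type

targets:
  azure:
    workspace:
      host: https://adb-1111111111111111.11.azuredatabricks.net
  gcp:
    workspace:
      host: https://1111111111111111.1.gcp.databricks.com
```

The open question is whether running `databricks bundle deploy -t gcp` against a definition like this is a reasonable way to move these assets, or whether DAB is really intended for CI/CD within a single cloud rather than cross-cloud migration.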
Specifically, we're looking for clarity on the following points:
How does DAB handle cross-cloud migrations, particularly when the source and target workspaces are attached to different metastores?
How is classic compute recreated during migration, given that the underlying driver and worker node types differ between Azure and GCP? (A sketch of the kind of per-target override we're picturing follows these questions.)
If DAB is not suitable, are there any Databricks-native tools or recommended approaches for this type of workspace migration (for example, the Databricks Terraform Resource Exporter)?
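On the classic compute question, this is the kind of per-target override we assume would be needed; the job key, task key, and node type IDs are illustrative examples only, not a confirmed pattern:

```yaml
# Per-target cluster overrides -- node type IDs are illustrative only
targets:
  azure:
    resources:
      jobs:
        nightly_etl:
          tasks:
            - task_key: ingest
              new_cluster:
                node_type_id: Standard_DS3_v2  # Azure VM SKU
  gcp:
    resources:
      jobs:
        nightly_etl:
          tasks:
            - task_key: ingest
              new_cluster:
                node_type_id: n2-standard-4    # GCP machine type
```

We're also unclear whether other cloud-specific cluster fields (for example `azure_attributes` vs. `gcp_attributes`) would need similar per-target handling, or whether the compute definitions effectively have to be re-authored for GCP.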
Appreciate any guidance or shared experiences.