Administration & Architecture

Databricks Asset Bundles capability for cross-cloud migration

abhijit007
New Contributor III

Hi everyone,

We are planning a migration from Azure Databricks to GCP Databricks and would like to understand whether Databricks Asset Bundles (DAB) can be used to migrate workspace assets such as jobs, pipelines, notebooks, and custom serving endpoints.

Specifically, we're looking for clarity on the following points:

  1. How does DAB handle cross-cloud migrations, particularly when moving across different metastores?

  2. How is classic compute recreated during migration, considering the underlying driver and worker infrastructure differs between Azure and GCP?

  3. If DAB is not suitable, are there any Databricks-native tools or recommended approaches for this type of workspace migration (for example, Databricks Terraform Resource Exporter)?

Appreciate any guidance or shared experiences.

1 ACCEPTED SOLUTION


iyashk-DB
Databricks Employee

DABs are useful but not sufficient on their own. They work well for re-creating control-plane assets such as jobs, notebooks, DLT/Lakeflow pipelines, and model serving endpoints in a target workspace, even across clouds, by using environment-specific targets and variables.
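
For example, here's a minimal databricks.yml sketch of the targets-and-variables pattern; the bundle name, workspace hosts, node types, and job definition are hypothetical placeholders, not values from this thread:

```yaml
# Hypothetical sketch: one bundle, two cloud targets.
bundle:
  name: etl_bundle              # placeholder name

variables:
  node_type:
    description: Cloud-specific worker node type
    default: Standard_DS3_v2    # Azure default

targets:
  azure:
    workspace:
      host: https://adb-1111111111111111.11.azuredatabricks.net  # placeholder
  gcp:
    workspace:
      host: https://1111111111111111.1.gcp.databricks.com        # placeholder
    variables:
      node_type: n2-standard-4  # remapped GCP machine type

resources:
  jobs:
    nightly_etl:
      name: nightly_etl
      job_clusters:
        - job_cluster_key: main
          new_cluster:
            spark_version: 15.4.x-scala2.12
            node_type_id: ${var.node_type}   # resolved per target
            num_workers: 2
      tasks:
        - task_key: run_etl
          job_cluster_key: main
          notebook_task:
            notebook_path: ./notebooks/etl_notebook
```

Deploying with `databricks bundle deploy -t gcp` would then create the job with the GCP node type substituted in.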

1) DAB does not migrate Unity Catalog metastores, data, or tables. Since UC metastores are cloud-scoped, you must create a new GCP metastore, recreate catalogs/schemas/permissions, and migrate data separately (SYNC for external tables, CTAS/DEEP CLONE for managed tables).
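
As a rough sketch of that data step, run in the target (GCP) workspace and assuming the Delta files have already been copied to GCS; all catalog/schema/table names and paths here are hypothetical:

```sql
-- External table: if it has been re-registered in the new workspace's
-- hive_metastore against its (copied) GCS location, SYNC upgrades it into UC.
SYNC TABLE main.sales.orders_ext FROM hive_metastore.sales.orders_ext;

-- Managed table: DEEP CLONE the copied Delta files into a managed UC table
-- (copies both data and metadata).
CREATE OR REPLACE TABLE main.sales.orders
  DEEP CLONE delta.`gs://my-migration-bucket/sales/orders`;

-- CTAS is the fallback when CLONE is not applicable.
CREATE TABLE main.sales.orders_ctas AS
SELECT * FROM delta.`gs://my-migration-bucket/sales/orders`;
```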

2) For classic compute, there is no 1:1 mapping between Azure and GCP VM types. DAB can reference or define compute, but you must remap node types and cloud-specific settings for GCP. In practice, teams export clusters and jobs using the Databricks Terraform Resource Exporter, update node types, storage paths, secrets, and cloud attributes, and then apply them to the GCP workspace.
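
A rough sketch of that flow, run against the Azure workspace first (the exporter ships with the Terraform provider binary; the version, flags, and resource values below are illustrative, so check the exporter docs for your provider release):

```hcl
# Run the exporter that ships with the Terraform provider binary, e.g.:
#   ./terraform-provider-databricks_v1.x exporter -skip-interactive \
#       -services=jobs,compute -directory=./azure-export
# It generates *.tf files containing resources like the (abridged,
# hypothetical) cluster below, which you edit before applying to GCP.

resource "databricks_cluster" "etl" {
  cluster_name            = "etl"
  spark_version           = "15.4.x-scala2.12"
  num_workers             = 2
  autotermination_minutes = 30

  # Exported from Azure as "Standard_DS3_v2"; remapped by hand for GCP.
  node_type_id = "n2-standard-4"
}
```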

3) As noted above: use the Terraform Resource Exporter to lift-and-shift the existing workspace assets, migrate UC metadata and data explicitly, and then adopt DABs for ongoing CI/CD to manage jobs, pipelines, and serving endpoints going forward. In short, think of Terraform for the initial migration and DABs for long-term, clean, multi-environment deployments.
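
For the ongoing-CI/CD half, the day-to-day loop is just the bundle CLI against the GCP target (the target and job names match the hypothetical sketch above):

```sh
databricks bundle validate -t gcp        # check the config resolves for the GCP target
databricks bundle deploy -t gcp          # create/update jobs, pipelines, endpoints
databricks bundle run -t gcp nightly_etl # smoke-test the deployed job
```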


2 REPLIES


abhijit007
New Contributor III

@iyashk-DB Thanks for the details. It helps.