Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

DAB multiple workspaces

samtech
New Contributor

Hi,

We have 3 regional workspaces. Assume we keep a separate notebook folder per region, say amer/xx, apac/xx, and emea/xx, plus separate job/pipeline configurations for each region in Git. How do we make sure that, during deployment, the appropriate jobs/pipelines are deployed to the respective workspaces?

1 REPLY

Alberto_Umana
Databricks Employee

Hi @samtech,

  • Define a target for each region in your bundle configuration. You can keep the region-specific resource definitions (notebooks, jobs, pipelines) and their paths in separate files, for example amer_bundle.yml, apac_bundle.yml, and emea_bundle.yml, and reference them from the main databricks.yml via the top-level include: mapping.
  • In the bundle configuration, define one target per regional workspace (see also the target-level resources sketch after the example). For example:

    bundle:
      name: <bundle-name>

    targets:
      amer:
        workspace:
          host: https://<amer-workspace-url>
          profile: amer_profile
      apac:
        workspace:
          host: https://<apac-workspace-url>
          profile: apac_profile
      emea:
        workspace:
          host: https://<emea-workspace-url>
          profile: emea_profile
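
To make sure only a region's own jobs and notebooks reach that region's workspace, the region-specific resources can also be declared directly under the matching target, so they are only included when that target is selected. Below is a minimal sketch, assuming a hypothetical amer_daily_job that runs a notebook from the amer/xx folder in your repo; compute settings are omitted for brevity:

    targets:
      amer:
        workspace:
          host: https://<amer-workspace-url>
        resources:
          jobs:
            amer_daily_job:                 # hypothetical job name
              name: amer_daily_job
              tasks:
                - task_key: run_amer_notebook
                  notebook_task:
                    notebook_path: ./amer/xx/example_notebook   # placeholder path

Resources declared under a target are merged with the top-level resources, so the amer job above is only created when deploying with --target amer.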

  • Use the Databricks CLI to deploy the bundle to a specific workspace by selecting the matching target. For example, to deploy to the amer workspace:

databricks bundle deploy --target amer
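
If you drive the deployments from CI, one possible pattern (a sketch, assuming authentication is resolved through the per-target profiles above) is to loop over the three targets:

    for region in amer apac emea; do
      databricks bundle deploy --target "$region"
    done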
