Hi @Seunghyun,
This is a common architecture question, and there are several approaches depending on your requirements around freshness, governance, and operational overhead. Let me address each of your questions directly and then recommend an overall strategy.
ACCOUNT-LEVEL SHARING (NO MIGRATION REQUIRED)
AI/BI Dashboards support account-level sharing natively. When you publish a dashboard in Workspace A, you can share it with any registered member of your Databricks account, even if they do not have access to Workspace A. This means users in Workspace B can view published dashboards from Workspace A without any export/import step.
To set this up:
1. Publish the dashboard in Workspace A.
2. In the Share dialog, add account-level users or groups by email or group name.
3. Those users receive CAN RUN permissions automatically, giving them view-only access to the published dashboard.
When you publish with "Shared data permissions" (the default), viewers run queries using the publisher's credentials, so they do not need direct access to the underlying data. This is typically the best choice for cross-workspace consumers who only need to view results.
When you publish with "Individual data permissions," each viewer's own credentials are used to query data. This requires that the viewer has access to the underlying Unity Catalog tables, which works well when the data is shared across workspaces via Unity Catalog (since UC is account-level).
Key documentation:
https://docs.databricks.com/aws/en/dashboards/share/share
IDENTITY MANAGEMENT FOR CROSS-WORKSPACE ACCESS
For managing "View Only" access in Workspace B, account-level groups are the recommended approach. Here is why:
1. Create an account-level group (e.g., "dashboard-viewers-team-x") in the account console.
2. Add users from any workspace to that group.
3. Share published dashboards with this group rather than individual users.
This gives you centralized control: when someone joins or leaves the team, you update the group membership once and it propagates to all shared dashboards. Account-level groups work with both workspace-level and account-level sharing, so this approach scales well.
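If you want to automate group management rather than click through the account console, the Account SCIM API can create groups and memberships. The sketch below is stdlib-only; the account ID, token, and member IDs are placeholders, and you should confirm the endpoint shape against the current Account SCIM documentation before relying on it:

```python
# Sketch: create an account-level group via the Databricks Account SCIM API.
# ACCOUNT_ID, TOKEN, and member IDs below are placeholders.
import json
import urllib.request

def scim_group_payload(display_name, member_ids):
    """Build the SCIM 2.0 body for a new account-level group."""
    return {
        "schemas": ["urn:ietf:params:scim:schemas:core:2.0:Group"],
        "displayName": display_name,
        "members": [{"value": str(m)} for m in member_ids],
    }

def create_account_group(account_id, token, payload):
    """POST the group to the account-level SCIM endpoint (AWS host shown)."""
    url = (f"https://accounts.cloud.databricks.com/api/2.0/accounts/"
           f"{account_id}/scim/v2/Groups")
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/scim+json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

if __name__ == "__main__":
    payload = scim_group_payload("dashboard-viewers-team-x", ["1234", "5678"])
    # create_account_group("<account-id>", "<token>", payload)  # needs real credentials
```

The same pattern works for updating membership (PATCH on the group resource) when someone joins or leaves the team.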
Key documentation:
https://docs.databricks.com/aws/en/admin/users-groups/
UNITY CATALOG AND DASHBOARD DISCOVERABILITY
Unity Catalog permissions govern access to the underlying data (catalogs, schemas, tables), but they do not directly control dashboard discoverability. Dashboards are workspace objects, not Unity Catalog securables. However, the combination of UC and account-level sharing gives you a workable pattern:
- UC ensures that users across workspaces can access the same data (when granted appropriate permissions).
- Account-level dashboard sharing ensures those same users can view the published dashboards without needing workspace access.
There is currently no account-wide "catalog of dashboards" that users can browse across workspaces the way they browse tables in UC. A dashboard must be shared with a user explicitly, or they need a direct link to it.
CI/CD APPROACH FOR DASHBOARD PROMOTION
If you need the same dashboard deployed to multiple workspaces (for example, each workspace gets its own copy pointing to local data), Databricks Asset Bundles (DABs) provide a clean CI/CD solution:
1. Export the dashboard from the source workspace:
databricks bundle generate dashboard --existing-dashboard-id <dashboard-id>
2. This produces a .lvdash.json file and a YAML configuration. Add these to version control.
3. Define targets in your databricks.yml for each workspace:
bundle:
  name: shared-dashboards

variables:
  warehouse_id:
    description: SQL warehouse that powers the dashboard

resources:
  dashboards:
    sales_dashboard:
      display_name: "Sales Dashboard"
      file_path: ./dashboards/sales.lvdash.json
      warehouse_id: ${var.warehouse_id}

targets:
  workspace_a:
    workspace:
      host: https://workspace-a.cloud.databricks.com
    variables:
      warehouse_id: "abc123"
  workspace_b:
    workspace:
      host: https://workspace-b.cloud.databricks.com
    variables:
      warehouse_id: "def456"
4. Deploy to any target:
databricks bundle deploy --target workspace_b
This keeps a single source of truth in version control and lets you push updates to any workspace with one command. It integrates with any CI/CD system (GitHub Actions, Azure DevOps, Jenkins, etc.).
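As one illustration, a minimal GitHub Actions workflow for step 4 might look like the sketch below. The workflow name, branch, and secret name are placeholders; `databricks/setup-cli` is the official action for installing the CLI:

```yaml
name: deploy-dashboards
on:
  push:
    branches: [main]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: databricks/setup-cli@main
      - name: Deploy bundle to Workspace B
        run: databricks bundle deploy --target workspace_b
        env:
          DATABRICKS_TOKEN: ${{ secrets.WORKSPACE_B_TOKEN }}
```

You would typically add one deploy step (or a matrix) per target workspace, each with its own token or service-principal credentials.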
Key documentation:
https://docs.databricks.com/aws/en/dev-tools/bundles/
REST API ALTERNATIVE
If you prefer a lighter-weight automation without full DABs, the REST API supports export and import directly:
- Export: GET /api/2.0/workspace/export (returns the .lvdash.json)
- Import: POST /api/2.0/workspace/import (with format set to AUTO and path ending in .lvdash.json)
- Publish: POST /api/2.0/lakeview/dashboards/{dashboard_id}/published
You could wrap these in a simple script that pulls from Workspace A and pushes to Workspace B on a schedule or trigger.
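A stdlib-only sketch of that script is below. Hostnames, tokens, and the /Shared/... paths are placeholders, and you should verify the endpoint details against the workspace-dashboard-api tutorial linked underneath; note that the Workspace API wraps file content in base64:

```python
# Sketch: pull a dashboard from Workspace A and push it to Workspace B.
# Hosts, tokens, and paths are placeholders.
import base64
import json
import urllib.parse
import urllib.request

def api_call(host, token, method, endpoint, params=None, body=None):
    """Minimal helper for Databricks REST calls (no third-party deps)."""
    url = f"https://{host}/api/2.0{endpoint}"
    if params:
        url += "?" + urllib.parse.urlencode(params)
    data = json.dumps(body).encode() if body is not None else None
    req = urllib.request.Request(
        url, data=data, method=method,
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def decode_export(export_response):
    """Workspace export returns base64-encoded content; decode to raw bytes."""
    return base64.b64decode(export_response["content"])

def import_payload(target_path, raw_bytes):
    """Build the body for POST /workspace/import with AUTO format."""
    return {
        "path": target_path,  # must end in .lvdash.json
        "format": "AUTO",
        "overwrite": True,
        "content": base64.b64encode(raw_bytes).decode(),
    }

def sync_dashboard(src_host, src_token, src_path,
                   dst_host, dst_token, dst_path):
    """Export from the source workspace and import into the destination."""
    exported = api_call(src_host, src_token, "GET", "/workspace/export",
                        params={"path": src_path})
    api_call(dst_host, dst_token, "POST", "/workspace/import",
             body=import_payload(dst_path, decode_export(exported)))
    # To publish after import, POST to the Lakeview endpoint cited above:
    # /lakeview/dashboards/{dashboard_id}/published
```

Run on a schedule (e.g. a Databricks job or cron) this gives you a one-way sync from Workspace A to Workspace B without adopting bundles.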
Key documentation:
https://docs.databricks.com/aws/en/dashboards/tutorials/workspace-dashboard-api
RECOMMENDED APPROACH
For your scenario (same account, Unity Catalog, view-only consumers in Workspace B), I would recommend starting with account-level sharing:
1. Publish dashboards in Workspace A with shared data permissions.
2. Create account-level groups for your cross-workspace viewers.
3. Share published dashboards with those groups.
This gives you a single source of truth (the dashboard lives in Workspace A only), no export/import overhead, and centralized identity management. If your needs evolve to require independent copies in each workspace, you can layer in Databricks Asset Bundles for CI/CD later.
* This reply was drafted with an agent system I built, which researches responses against the wide set of documentation I have available and previous memory. I personally review each draft for obvious issues and to monitor system reliability, and I update it when I detect any drift, but there is still a small chance something is inaccurate, especially if you are experimenting with brand-new features.