Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Best Practice for Sharing AI/BI Dashboards across Workspaces in the same Account

Seunghyun
Contributor

Hello everyone,

I’m looking for the most efficient way to share dashboards between two workspaces (Workspace A and Workspace B) within the same Databricks account.

[Current Setup]

  • Account: Single account with two workspaces (A and B).

  • Data Governance: Both workspaces share the same catalogs via Unity Catalog.

  • Goal: I want to create a dashboard in Workspace A and have users in Workspace B view it with the same permissions, but without the manual overhead of exporting/importing (JSON/DBC).

[What I’ve tried] I’ve already tested the manual Export and Import process. While it works, it’s too tedious for frequent updates and doesn’t maintain a "single source of truth." If the dashboard in Workspace A is updated, I have to re-export it to Workspace B every time.

[Questions]

  1. Is there a way to publish a dashboard to the Account level so that it’s accessible from Workspace B's dashboard menu without physical migration?

  2. If the users in Workspace B need "View Only" access, what is the best way to handle Identity Management? Should the dashboard be shared with an Account-level group?

  3. Is there a way to leverage Unity Catalog's permissions to make the dashboard object itself discoverable across workspaces, similar to how we share tables?

I would appreciate any advice on the most "automated" or "integrated" workflow for this. I'm trying to avoid manual file movements as much as possible.

Thanks in advance for your help!


 

2 REPLIES

MoJaMa
Databricks Employee

You don't need to move it from A to B.

Just publish it so that all users in your account can interact with it. (This covers your questions #1 and #2.)

https://www.databricks.com/blog/sharing-aibi-dashboards

Regarding #3: discoverability is a slightly different problem, because today the dashboard will not show up in a search in Workspace B.

But there are future plans to achieve such discoverability through Databricks One.

https://www.databricks.com/blog/introducing-databricks-one

(Imagine that in the future, all Genie Spaces, Dashboards, and Apps in an account are discoverable through Databricks One, regardless of which workspace they live in.)

 

SteveOstrowski
Databricks Employee

Hi @Seunghyun,

This is a common architecture question, and there are several approaches depending on your requirements around freshness, governance, and operational overhead. Let me address each of your questions directly and then recommend an overall strategy.


ACCOUNT-LEVEL SHARING (NO MIGRATION REQUIRED)

AI/BI Dashboards support account-level sharing natively. When you publish a dashboard in Workspace A, you can share it with any registered member of your Databricks account, even if they do not have access to Workspace A. This means users in Workspace B can view published dashboards from Workspace A without any export/import step.

To set this up:

1. Publish the dashboard in Workspace A.
2. In the Share dialog, add account-level users or groups by email or group name.
3. Those users receive CAN RUN permissions automatically, giving them view-only access to the published dashboard.

When you publish with "Shared data permissions" (the default), viewers run queries using the publisher's credentials, so they do not need direct access to the underlying data. This is typically the best choice for cross-workspace consumers who only need to view results.

When you publish with "Individual data permissions," each viewer's own credentials are used to query data. This requires that the viewer has access to the underlying Unity Catalog tables, which works well when the data is shared across workspaces via Unity Catalog (since UC is account-level).

Key documentation:
https://docs.databricks.com/aws/en/dashboards/share/share
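If you want to script the publish step, it can be made directly against the Lakeview REST API mentioned in the docs above. Here is a minimal sketch using only the Python standard library; the host, token, dashboard ID, and warehouse ID are placeholders, and `embed_credentials=True` is my assumption for how "Shared data permissions" maps onto the API body:

```python
import json
import urllib.request


def publish_payload(warehouse_id, embed_credentials=True):
    """Request body for the Lakeview publish endpoint.

    embed_credentials=True corresponds to "Shared data permissions":
    viewers run queries with the publisher's credentials.
    """
    return {"embed_credentials": embed_credentials, "warehouse_id": warehouse_id}


def publish_dashboard(host, token, dashboard_id, warehouse_id):
    """POST /api/2.0/lakeview/dashboards/{id}/published on the given workspace."""
    req = urllib.request.Request(
        f"{host}/api/2.0/lakeview/dashboards/{dashboard_id}/published",
        data=json.dumps(publish_payload(warehouse_id)).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


# Example (placeholder values, do not run as-is):
# publish_dashboard("https://workspace-a.cloud.databricks.com",
#                   "dapi...", "<dashboard-id>", "abc123")
```

Sharing the published dashboard with account-level users or groups is still a separate step, done in the Share dialog.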


IDENTITY MANAGEMENT FOR CROSS-WORKSPACE ACCESS

For managing "View Only" access in Workspace B, account-level groups are the recommended approach. Here is why:

1. Create an account-level group (e.g., "dashboard-viewers-team-x") in the account console.
2. Add users from any workspace to that group.
3. Share published dashboards with this group rather than individual users.

This gives you centralized control: when someone joins or leaves the team, you update the group membership once and it propagates to all shared dashboards. Account-level groups work with both workspace-level and account-level sharing, so this approach scales well.

Key documentation:
https://docs.databricks.com/aws/en/admin/users-groups/
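Group creation can also be automated against the account-level SCIM API rather than clicking through the account console. A hedged sketch, standard library only; the account host, account ID, token, group name, and member IDs are all placeholders:

```python
import json
import urllib.request

SCIM_GROUP_SCHEMA = "urn:ietf:params:scim:schemas:core:2.0:Group"


def group_payload(display_name, member_ids=()):
    """SCIM 2.0 body for an account-level group with optional initial members."""
    return {
        "schemas": [SCIM_GROUP_SCHEMA],
        "displayName": display_name,
        "members": [{"value": m} for m in member_ids],
    }


def create_account_group(account_host, account_id, token, display_name, member_ids=()):
    """POST to the account SCIM Groups endpoint.

    account_host is e.g. https://accounts.cloud.databricks.com on AWS.
    """
    req = urllib.request.Request(
        f"{account_host}/api/2.0/accounts/{account_id}/scim/v2/Groups",
        data=json.dumps(group_payload(display_name, member_ids)).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/scim+json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Once the group exists, share the published dashboards with it by name in the Share dialog; membership changes then propagate automatically.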


UNITY CATALOG AND DASHBOARD DISCOVERABILITY

Unity Catalog permissions govern access to the underlying data (catalogs, schemas, tables), but they do not directly control dashboard discoverability. Dashboards are workspace objects, not Unity Catalog securables. However, the combination of UC and account-level sharing gives you a workable pattern:

- UC ensures that users across workspaces can access the same data (when granted appropriate permissions).
- Account-level dashboard sharing ensures those same users can view the published dashboards without needing workspace access.

There is not currently a "catalog of dashboards" that users can browse across workspaces the way they browse tables in UC. Users need the direct link or it needs to be shared with them explicitly.


CI/CD APPROACH FOR DASHBOARD PROMOTION

If you need the same dashboard deployed to multiple workspaces (for example, each workspace gets its own copy pointing to local data), Databricks Asset Bundles (DABs) provide a clean CI/CD solution:

1. Export the dashboard from the source workspace:

databricks bundle generate dashboard --existing-dashboard-id <dashboard-id>

2. This produces a .lvdash.json file and a YAML configuration. Add these to version control.

3. Define targets in your databricks.yml for each workspace:

bundle:
  name: shared-dashboards

resources:
  dashboards:
    sales_dashboard:
      display_name: "Sales Dashboard"
      file_path: ./dashboards/sales.lvdash.json
      warehouse_id: ${var.warehouse_id}

targets:
  workspace_a:
    workspace:
      host: https://workspace-a.cloud.databricks.com
    variables:
      warehouse_id: "abc123"
  workspace_b:
    workspace:
      host: https://workspace-b.cloud.databricks.com
    variables:
      warehouse_id: "def456"

4. Deploy to any target:

databricks bundle deploy --target workspace_b

This keeps a single source of truth in version control and lets you push updates to any workspace with one command. It integrates with any CI/CD system (GitHub Actions, Azure DevOps, Jenkins, etc.).

Key documentation:
https://docs.databricks.com/aws/en/dev-tools/bundles/


REST API ALTERNATIVE

If you prefer a lighter-weight automation without full DABs, the REST API supports export and import directly:

- Export: GET /api/2.0/workspace/export (returns the .lvdash.json)
- Import: POST /api/2.0/workspace/import (with format set to AUTO and path ending in .lvdash.json)
- Publish: POST /api/2.0/lakeview/dashboards/{dashboard_id}/published

You could wrap these in a simple script that pulls from Workspace A and pushes to Workspace B on a schedule or trigger.

Key documentation:
https://docs.databricks.com/aws/en/dashboards/tutorials/workspace-dashboard-api
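As an illustration, that wrapper script could look like the following sketch (Python standard library only; the hosts, tokens, and dashboard path are placeholders, and error handling and retries are omitted):

```python
import json
import urllib.parse
import urllib.request


def _call(host, token, endpoint, payload=None, method="GET"):
    """Minimal REST helper; payload (if any) is sent as JSON."""
    data = json.dumps(payload).encode() if payload is not None else None
    req = urllib.request.Request(
        f"{host}{endpoint}",
        data=data,
        method=method,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


def dashboard_path(folder, name):
    """Workspace path for a dashboard file; must end in .lvdash.json."""
    return f"{folder.rstrip('/')}/{name}.lvdash.json"


def export_dashboard(host, token, path):
    """Export the dashboard at `path`; returns its content base64-encoded."""
    qs = urllib.parse.urlencode({"path": path, "format": "AUTO"})
    return _call(host, token, f"/api/2.0/workspace/export?{qs}")["content"]


def import_dashboard(host, token, path, content_b64):
    """Create or overwrite the dashboard at `path` in the target workspace."""
    return _call(
        host, token, "/api/2.0/workspace/import",
        {"path": path, "format": "AUTO", "content": content_b64, "overwrite": True},
        method="POST",
    )


def sync(host_a, token_a, host_b, token_b, path):
    # One-way sync: Workspace A stays the single source of truth.
    import_dashboard(host_b, token_b, path, export_dashboard(host_a, token_a, path))
```

You could run `sync()` from a scheduled job or a CI trigger whenever the source dashboard changes.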


RECOMMENDED APPROACH

For your scenario (same account, Unity Catalog, view-only consumers in Workspace B), I would recommend starting with account-level sharing:

1. Publish dashboards in Workspace A with shared data permissions.
2. Create account-level groups for your cross-workspace viewers.
3. Share published dashboards with those groups.

This gives you a single source of truth (the dashboard lives in Workspace A only), no export/import overhead, and centralized identity management. If your needs evolve to require independent copies in each workspace, you can layer in Databricks Asset Bundles for CI/CD later.

* This reply was drafted with an agent system I built, which researches responses from the documentation I have available and previous memory. I personally review each draft for obvious issues and to monitor system reliability, and I update it when I detect drift, but there is still a small chance something is inaccurate, especially if you are experimenting with brand-new features.