Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Databricks Genie dashboard promote from Genie TEST workspace to another

Subra2025
New Contributor II

Hi,

We have manually migrated Power BI dashboard components to the Databricks Genie TEST workspace.

What is the procedure or approach to promote these Genie TEST workspace components to the Databricks Genie PROD workspace?

Thanks

Subra

3 REPLIES

MoJaMa
Databricks Employee

Subra2025
New Contributor II

Thank you @MoJaMa. Can we do this migration/promotion of the Genie TEST workspace to the PROD workspace using Azure DevOps CI/CD? If yes, can you let me know the Azure repo structure, what type of data or code would be stored in the repo, and how?

 

SteveOstrowski
Databricks Employee

Hi @Subra2025,

There are two separate things to consider here: promoting AI/BI Dashboards and promoting Genie Spaces. They have different promotion paths, so I will cover both.


PROMOTING AI/BI DASHBOARDS ACROSS WORKSPACES

AI/BI Dashboards have mature CI/CD support through multiple approaches:

1. Databricks Asset Bundles (DABs): Dashboards are a supported resource type in DABs. You can use "databricks bundle generate" to export an existing dashboard to a .lvdash.json file and a YAML configuration, then deploy it to different targets (dev, staging, prod) using "databricks bundle deploy --target prod". This integrates cleanly with Azure DevOps or any CI/CD system.

Example bundle configuration in databricks.yml:

bundle:
  name: my-dashboards

resources:
  dashboards:
    my_dashboard:
      display_name: "My Dashboard"
      file_path: ./dashboards/my_dashboard.lvdash.json
      warehouse_id: ${var.warehouse_id}

targets:
  dev:
    workspace:
      host: https://adb-dev.azuredatabricks.net
  prod:
    workspace:
      host: https://adb-prod.azuredatabricks.net

2. Import/Export: You can export dashboards as .lvdash.json files from the UI or API, store them in a Git repo, and import them into another workspace. The REST API endpoints under /api/2.0/lakeview/dashboards support get, create, and update operations.
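As a rough sketch of the REST route above (the endpoint path is the one documented under /api/2.0/lakeview/dashboards; the helper names and placeholders are mine, and this only works against a real workspace with a valid token):

```python
# Hedged sketch: fetch a dashboard definition via the Lakeview REST API
# so it can be committed to Git. Standard library only.
import json
import urllib.request

def lakeview_url(host, dashboard_id=""):
    """Build the Lakeview dashboards endpoint URL (per the API docs above)."""
    base = f"{host}/api/2.0/lakeview/dashboards"
    return f"{base}/{dashboard_id}" if dashboard_id else base

def export_dashboard(host, token, dashboard_id):
    """GET the dashboard and return the API's JSON response as a dict."""
    req = urllib.request.Request(
        lakeview_url(host, dashboard_id),
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Usage (requires a real workspace, token, and dashboard ID):
# dash = export_dashboard("https://adb-dev.azuredatabricks.net", TOKEN, "abc123")
# The response contains the dashboard definition, which you can write to a
# .lvdash.json file in your repo -- check the API reference for the exact field.
```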

3. Git Folders: You can version-control your .lvdash.json files in Databricks Git folders and use branch-based workflows for dev-to-prod promotion.

Documentation reference:
https://docs.databricks.com/en/dashboards/index.html


PROMOTING GENIE SPACES ACROSS WORKSPACES

Genie Spaces do not yet have native support in Databricks Asset Bundles (there is an open feature request for this). However, the Genie Management REST API does support programmatic export and import, which can be used with CI/CD.

The key API concept is the "serialized_space" field. When you call GetSpace with include_serialized_space=True, you get a portable JSON representation of your Genie Space that includes:

- Sample questions and settings
- Data source (table) configurations
- Instructions and context rules

This serialized_space can be passed directly to CreateSpace or UpdateSpace in another workspace to recreate or update the space.

Here is the general approach for CI/CD with Azure DevOps:

Step 1: Export the Genie Space from the source workspace using the Python SDK:

from databricks.sdk import WorkspaceClient
import json

source_client = WorkspaceClient(
    host="https://adb-source.azuredatabricks.net",
    # authentication via service principal or token
)

space = source_client.genie.get_space(
    space_id="<source-space-id>",
    include_serialized_space=True
)

# Save the serialized space config to a JSON file
config = {
    "title": space.title,
    "description": space.description,
    "warehouse_id": "<target-warehouse-id>",
    "serialized_space": space.serialized_space,
    "table_identifiers": [t.table_identifier for t in space.table_identifiers]
}
with open("genie_space_export.json", "w") as f:
    json.dump(config, f, indent=2)

Step 2: Store the exported JSON file in your Azure DevOps Git repository.

Step 3: Deploy to the target workspace in your CI/CD pipeline:

from databricks.sdk import WorkspaceClient
import json

target_client = WorkspaceClient(
    host="https://adb-prod.azuredatabricks.net",
    # authentication via service principal
)

with open("genie_space_export.json") as f:
    config = json.load(f)

# To create a new space:
new_space = target_client.genie.create_space(
    title=config["title"],
    description=config["description"],
    warehouse_id=config["warehouse_id"],
    serialized_space=config["serialized_space"],
    table_identifiers=config["table_identifiers"]
)
print(f"Created space: {new_space.space_id}")

# Or to update an existing space:
target_client.genie.update_space(
    space_id="<target-space-id>",
    title=config["title"],
    description=config["description"],
    serialized_space=config["serialized_space"],
    table_identifiers=config["table_identifiers"]
)


AZURE DEVOPS REPO STRUCTURE

For a combined approach handling both dashboards and Genie Spaces, a repo structure like this works well:

my-databricks-project/
  databricks.yml                  # Asset Bundles config (for dashboards, jobs, etc.)
  dashboards/
    my_dashboard.lvdash.json      # Exported dashboard definitions
  genie_spaces/
    my_genie_space.json           # Exported Genie Space configs
  scripts/
    export_genie_space.py         # Export script
    deploy_genie_space.py         # Deploy script
  azure-pipelines.yml             # Azure DevOps pipeline definition

The dashboards/ folder is managed by DABs natively. The genie_spaces/ folder contains the exported JSON configs from the GetSpace API. The scripts/ folder contains Python scripts that use the Databricks SDK to handle Genie Space export and deployment.
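To tie the repo structure together, a minimal azure-pipelines.yml could look like the following. This is a sketch, not a definitive pipeline: it assumes the Databricks CLI install script documented by Databricks, service principal credentials stored as secret pipeline variables, and the script names from the tree above.

```yaml
trigger:
  branches:
    include:
      - main

pool:
  vmImage: ubuntu-latest

steps:
  # Install the Databricks CLI (official install script)
  - script: curl -fsSL https://raw.githubusercontent.com/databricks/setup-cli/main/install.sh | sh
    displayName: Install Databricks CLI

  # Deploy dashboards (and other bundle resources) via Asset Bundles
  - script: databricks bundle deploy --target prod
    displayName: Deploy dashboards via Asset Bundles
    env:
      DATABRICKS_HOST: $(DATABRICKS_HOST)
      DATABRICKS_CLIENT_ID: $(DATABRICKS_CLIENT_ID)
      DATABRICKS_CLIENT_SECRET: $(DATABRICKS_CLIENT_SECRET)

  # Deploy the Genie Space via the SDK script from scripts/
  - script: python scripts/deploy_genie_space.py genie_spaces/my_genie_space.json
    displayName: Deploy Genie Space via SDK script
    env:
      DATABRICKS_HOST: $(DATABRICKS_HOST)
      DATABRICKS_CLIENT_ID: $(DATABRICKS_CLIENT_ID)
      DATABRICKS_CLIENT_SECRET: $(DATABRICKS_CLIENT_SECRET)
```

The two deploy steps share the same service principal credentials, so the bundle deploy and the Genie script run against the same target workspace.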

There is also a community example on GitHub that demonstrates this pattern using DABs jobs to orchestrate Genie Space migration:
https://github.com/hiydavid/databricks-genai-examples/tree/main/mlops/genie-migration


IMPORTANT NOTES

- The target workspace must have the same Unity Catalog tables accessible (same metastore or federated catalog) for the Genie Space to work correctly after migration.
- You will need to update the warehouse_id to point to a valid SQL warehouse in the target workspace.
- The Genie Space will get a new space_id in the target workspace. Track this ID for subsequent update deployments.
- For authentication in CI/CD, use Service Principal credentials (DATABRICKS_HOST, DATABRICKS_CLIENT_ID, DATABRICKS_CLIENT_SECRET) rather than personal access tokens.
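On the last point: the Databricks SDK picks those three environment variables up automatically when you construct a WorkspaceClient with no arguments. A small sketch of failing fast when they are missing (the helper name is mine, not part of the SDK):

```python
# Sketch: validate the service principal variables before constructing a
# WorkspaceClient, which would otherwise fail later with a less obvious error.
import os

REQUIRED_VARS = ("DATABRICKS_HOST", "DATABRICKS_CLIENT_ID", "DATABRICKS_CLIENT_SECRET")

def missing_credentials(env=os.environ):
    """Return the names of required variables that are unset or empty."""
    return [name for name in REQUIRED_VARS if not env.get(name)]

# In a deployment script:
# from databricks.sdk import WorkspaceClient
# if missing_credentials():
#     raise SystemExit(f"Missing env vars: {missing_credentials()}")
# client = WorkspaceClient()  # auth resolved from the env vars above
```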

Genie API reference:
https://docs.databricks.com/api/workspace/genie

AI/BI Dashboards documentation:
https://docs.databricks.com/en/dashboards/index.html

Databricks Asset Bundles documentation:
https://docs.databricks.com/en/dev-tools/bundles/resources.html

Let me know if you need more details on any of these steps.

* This reply was drafted with an agent system I built, which researches and drafts responses based on the wide set of documentation I have available and previous memory. I personally review each draft for obvious issues and to monitor system reliability, and I update it when I detect any drift, but there is still a small chance that something is inaccurate, especially if you are experimenting with brand-new features.