Administration & Architecture

Trying to Back Up Dashboards and Queries from our Workspace.

pranav5
New Contributor II

We are using a Databricks workspace, and our IT team is decommissioning it as our time with it is ending. I have developed many queries and dashboards that I want to copy out; unfortunately, when I download the workspace as a ZIP or .dbc archive, these queries and dashboards are not included.

Is there a way to do this so that once we have budget again and a new workspace is provisioned, I can simply reuse these assets? This is a priority for us, as the deadline is Wednesday 11/12. Sorry this is last minute, but we never realized this issue would come up.

Your help on this would be really appreciated. I want to back up assets for my user and one other user: user1@example.com and user2@example.com.

TIA.

1 REPLY

bianca_unifeye
New Contributor II

Notebooks

These are the easiest assets to back up.
You can export them individually or in bulk as:

  • .dbc – Databricks archive format (can be re-imported directly into a new workspace)

  • .source or .py – raw code export (ideal for version control)

To download in bulk:

  1. Navigate to your Workspace folder in Databricks.

  2. Right-click → Export → choose format.

  3. Store in your shared drive or Git repository.
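If there is a lot to back up (for example, one folder per user), the same export is available through the workspace REST API. A minimal sketch, assuming a personal access token; the host and folder path are placeholders you would substitute:

  import base64, os, requests

  HOST = "https://<workspace-host>"        # placeholder: your workspace URL
  TOKEN = os.environ["DATABRICKS_TOKEN"]   # assumption: a PAT with workspace read access

  # Export an entire workspace folder (here, one user's home folder) as a .dbc archive.
  resp = requests.get(
      f"{HOST}/api/2.0/workspace/export",
      headers={"Authorization": f"Bearer {TOKEN}"},
      params={"path": "/Users/user1@example.com", "format": "DBC"},
  )
  resp.raise_for_status()

  # The API returns the archive base64-encoded in the "content" field.
  with open("workspace_backup.dbc", "wb") as f:
      f.write(base64.b64decode(resp.json()["content"]))

The Databricks CLI's workspace export-dir command does the same thing for source-file exports, if you prefer not to script against the API.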

Queries and Dashboards (Databricks SQL Editor / SQL Warehouses)

These are not included in .dbc or workspace ZIP exports.
They are stored separately by Databricks SQL, outside the workspace file tree.

You must export them manually or via the SQL REST API (see the sketch after this list):

  • Open each Query → Copy SQL → Save as .sql file.

  • For dashboards, also capture screenshots and export the definitions using /api/2.0/preview/sql/dashboards.

  • Store both in a structured folder (e.g., /sql_exports/queries and /sql_exports/dashboards).
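A minimal sketch of that export, assuming the legacy preview endpoints are still enabled on your workspace (newer AI/BI dashboards use a different API); host and token are placeholders:

  import json, os, requests

  HOST = "https://<workspace-host>"        # placeholder: your workspace URL
  TOKEN = os.environ["DATABRICKS_TOKEN"]   # assumption: a PAT with read access
  HEADERS = {"Authorization": f"Bearer {TOKEN}"}

  def fetch_all(endpoint):
      # Page through a legacy SQL preview list endpoint and collect every result.
      page, results = 1, []
      while True:
          r = requests.get(f"{HOST}{endpoint}", headers=HEADERS,
                           params={"page_size": 100, "page": page})
          r.raise_for_status()
          batch = r.json().get("results", [])
          if not batch:
              return results
          results.extend(batch)
          page += 1

  os.makedirs("sql_exports/queries", exist_ok=True)
  os.makedirs("sql_exports/dashboards", exist_ok=True)

  # Each query object carries its SQL text in the "query" field.
  for q in fetch_all("/api/2.0/preview/sql/queries"):
      safe_name = q["name"].replace("/", "_")
      with open(f"sql_exports/queries/{safe_name}.sql", "w") as f:
          f.write(q.get("query", ""))

  # The list endpoint returns dashboard summaries; fetch each full definition by id.
  for d in fetch_all("/api/2.0/preview/sql/dashboards"):
      full = requests.get(f"{HOST}/api/2.0/preview/sql/dashboards/{d['id']}",
                          headers=HEADERS)
      full.raise_for_status()
      safe_name = d["name"].replace("/", "_")
      with open(f"sql_exports/dashboards/{safe_name}.json", "w") as f:
          json.dump(full.json(), f, indent=2)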

I am assuming all dashboards and queries were created manually in the workspace UI and not deployed via Databricks Asset Bundles (DAB) or automation.

That means they exist only within the workspace metadata; if they are not backed up before decommissioning, they cannot be recovered later.

Recommendation for the New Environment

Once the new workspace is provisioned:

Adopt Databricks Asset Bundles (DAB)

  • Use DAB to automate deployments of notebooks, jobs, dashboards, and SQL assets from code.

  • It provides versioning, repeatable deployments, and environment isolation (Dev → Test → Prod).

  • All configuration lives as code in YAML — meaning if you lose a workspace, you can rehydrate it in minutes.
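As a sketch of what that looks like, a minimal databricks.yml might be (bundle name, job, notebook path, and host are all illustrative, not a definitive layout):

  # Minimal illustrative databricks.yml; names, paths, and host are placeholders.
  bundle:
    name: analytics-assets

  resources:
    jobs:
      nightly_refresh:
        name: nightly-refresh
        tasks:
          - task_key: run_notebook
            notebook_task:
              notebook_path: ./notebooks/refresh

  targets:
    dev:
      workspace:
        host: https://<workspace-host>

Deploying to a target is then a single command: databricks bundle deploy -t dev.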

Reference: Databricks Asset Bundles Documentation

Move Code to GitHub (or Azure DevOps)

  • Every notebook, pipeline, or SQL query should live in source control.

  • Avoid storing logic exclusively inside the Databricks workspace.

  • Use Git integration to sync notebooks automatically; this ensures version tracking, peer review, and rollback capability.
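One way to wire this up is via the Repos API; a sketch, with the repo URL and workspace path as placeholders:

  import os, requests

  HOST = "https://<workspace-host>"        # placeholder: your workspace URL
  TOKEN = os.environ["DATABRICKS_TOKEN"]   # assumption: a PAT with Repos permission

  # Link a GitHub repository into the workspace so notebooks sync with
  # source control instead of living only in the workspace UI.
  resp = requests.post(
      f"{HOST}/api/2.0/repos",
      headers={"Authorization": f"Bearer {TOKEN}"},
      json={
          "url": "https://github.com/<org>/<repo>.git",  # placeholder repo
          "provider": "gitHub",
          "path": "/Repos/user1@example.com/analytics-assets",
      },
  )
  resp.raise_for_status()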

Avoid Manual Notebooks in Production

  • Use notebooks only for experimentation or prototyping.

  • For production, all code should be deployed as:

    • Workflows (Jobs)

    • DAB-based pipelines

    • Versioned repos with CI/CD integration

This reduces drift, supports automated testing, and makes your environment reproducible. Hope this helps!
