Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Managing dashboard refresh schedules in DABs

Seunghyun
Contributor

I am currently using Databricks Asset Bundles (DABs) to deploy and manage dashboard resources. While I can manually add a schedule to a dashboard via the Databricks console, I would like to reflect this same configuration in the dashboard YAML file. However, the documentation (https://docs.databricks.com/aws/en/dev-tools/bundles/resources#dashboard) does not mention any keywords related to 'schedule,' and it appears the feature is not supported there.

On the other hand, the tasks documentation (https://docs.databricks.com/aws/en/dev-tools/bundles/job-task-types#dashboard-task) describes a 'dashboard task' that seems to handle refreshing. Is this the only way to automate dashboard refreshes through bundles?

1 ACCEPTED SOLUTION

Accepted Solutions

Ale_Armillotta
Valued Contributor II

Hi @Seunghyun .

I faced this issue a few days ago. In my experience, there is no way to schedule a refresh through the dashboard YAML. Genie spaces aren't supported either; to deploy a Genie space, I created a job that deploys it. If you want to refresh the dashboard, you need to redeploy the bundle, or, if you are using CI/CD pipelines with Asset Bundles, add a step that triggers a refresh via the REST API.


2 REPLIES


SteveOstrowski
Databricks Employee

Hi @Seunghyun,

You are correct that the dashboard resource definition in Databricks Asset Bundles does not currently include schedule-related properties. The dashboard resource supports properties like display_name, file_path, warehouse_id, embed_credentials, parent_path, and permissions, but there is no way to define a refresh schedule directly on the dashboard resource itself.

There are two approaches you can use to automate dashboard refreshes through bundles:

OPTION 1: DASHBOARD TASK IN A LAKEFLOW JOB (RECOMMENDED FOR DABS)

You can define a Lakeflow Job in your bundle YAML that includes a dashboard_task. This lets you schedule dashboard refreshes with full control over timing, retries, and notifications. Here is an example:

resources:
  dashboards:
    my-dashboard:
      display_name: "My Dashboard"
      file_path: ./dashboards/my-dashboard.lvdash.json
      warehouse_id: ${var.warehouse_id}

  jobs:
    refresh-my-dashboard:
      name: "Refresh My Dashboard"
      schedule:
        quartz_cron_expression: "0 0 8 * * ?"
        timezone_id: "America/New_York"
      tasks:
        - task_key: refresh-dashboard
          dashboard_task:
            dashboard_id: ${resources.dashboards.my-dashboard.id}

The dashboard_task supports these parameters:
- dashboard_id (required): the ID of the published dashboard to refresh
- warehouse_id (optional): SQL warehouse to use; defaults to the dashboard's configured warehouse
- subscription (optional): configure where to send dashboard snapshots after refresh (email, Slack, etc.)

This approach gives you the benefit of the full Lakeflow Jobs scheduling engine, including cron expressions, retries, conditional tasks, notifications, and the ability to chain the dashboard refresh after upstream data pipeline tasks.
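For reference, a fuller task definition using the optional parameters might look like the sketch below. The exact shape of the subscription fields (subscribers, custom_subject) is an assumption on my part based on the jobs schema, so verify it against the current bundle reference before relying on it:

```yaml
# Sketch of a dashboard_task using the optional parameters.
# Field names under `subscription` are assumptions; check the docs.
tasks:
  - task_key: refresh-dashboard
    dashboard_task:
      dashboard_id: ${resources.dashboards.my-dashboard.id}
      warehouse_id: ${var.warehouse_id}   # override the dashboard's default warehouse
      subscription:
        custom_subject: "Daily dashboard snapshot"
        subscribers:
          - user_name: analyst@example.com   # hypothetical recipient
```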

OPTION 2: REST API FOR NATIVE DASHBOARD SCHEDULES

If you prefer the built-in dashboard schedule (the same one you configure manually in the UI), you can use the Lakeview REST API to create schedules programmatically. The endpoint is:

POST /api/2.0/lakeview/dashboards/{dashboard_id}/schedules

This creates a native refresh schedule on the dashboard. However, this API call is not directly supported as a declarative property in the DABs dashboard resource schema. You would need to make this API call separately (for example, via a post-deployment script or a notebook task).
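As a rough illustration of such a post-deployment step, the sketch below builds the request for that endpoint. The payload shape (a cron_schedule object with a Quartz expression and a pause_status flag) is an assumption based on the jobs-style schedule object, so confirm it against the current Lakeview API reference; the workspace URL and dashboard ID are placeholders:

```python
import json

def build_schedule_request(host: str, dashboard_id: str,
                           quartz_cron: str, timezone_id: str) -> tuple[str, dict]:
    """Return the endpoint URL and JSON body for creating a dashboard schedule.

    The body shape is an assumption modeled on the jobs schedule object;
    verify against the current Lakeview API docs before use.
    """
    url = f"{host}/api/2.0/lakeview/dashboards/{dashboard_id}/schedules"
    body = {
        "cron_schedule": {
            "quartz_cron_expression": quartz_cron,
            "timezone_id": timezone_id,
        },
        "pause_status": "UNPAUSED",
    }
    return url, body

url, body = build_schedule_request(
    "https://example.cloud.databricks.com",  # hypothetical workspace URL
    "abc123",                                # hypothetical dashboard ID
    "0 0 8 * * ?",
    "America/New_York",
)
print(url)
print(json.dumps(body, indent=2))
# Send with your HTTP client of choice, e.g.:
#   requests.post(url, headers={"Authorization": f"Bearer {token}"}, json=body)
```

A small wrapper like this is convenient in CI/CD because the request can be built and inspected without touching the workspace.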

WHICH APPROACH TO USE

For most DABs workflows, Option 1 (the dashboard_task inside a job) is the better fit. It keeps everything declarative within your bundle YAML, integrates with your deployment lifecycle, and allows you to sequence the dashboard refresh after upstream ETL jobs complete. This is also the pattern the Databricks documentation recommends for workflow orchestration of dashboard refreshes.
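To make the sequencing point concrete, chaining the refresh after an upstream ETL task in the same job can be sketched as follows; the task keys and notebook path are illustrative, not from your bundle:

```yaml
# Sketch: refresh the dashboard only after the ETL task succeeds.
resources:
  jobs:
    etl-and-refresh:
      name: "ETL then Refresh Dashboard"
      tasks:
        - task_key: run-etl
          notebook_task:
            notebook_path: ./notebooks/daily_etl   # hypothetical notebook
        - task_key: refresh-dashboard
          depends_on:
            - task_key: run-etl
          dashboard_task:
            dashboard_id: ${resources.dashboards.my-dashboard.id}
```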

Relevant documentation:
- Dashboard task types in bundles: https://docs.databricks.com/en/dev-tools/bundles/job-task-types.html
- Dashboard tasks in Lakeflow Jobs: https://docs.databricks.com/aws/en/jobs/dashboard
- Schedule and subscribe to dashboards (UI/API): https://docs.databricks.com/aws/en/dashboards/share/schedule-subscribe
- Dashboard resource in bundles: https://docs.databricks.com/en/dev-tools/bundles/resources.html

* This reply was drafted with an agent system I built, which researches and drafts responses from the documentation I have available and prior memory. I personally review each draft for obvious issues and to monitor system reliability, and I update it when I detect drift, but there is still a small chance something is inaccurate, especially if you are experimenting with brand-new features.

If this answer resolves your question, could you mark it as "Accept as Solution"? That helps other users quickly find the correct fix.