Hi @Seunghyun,
You are correct that the dashboard resource definition in Databricks Asset Bundles does not currently include schedule-related properties. The dashboard resource supports properties like display_name, file_path, warehouse_id, embed_credentials, parent_path, and permissions, but there is no way to define a refresh schedule directly on the dashboard resource itself.
There are two approaches you can use to automate dashboard refreshes through bundles:
OPTION 1: DASHBOARD TASK IN A LAKEFLOW JOB (RECOMMENDED FOR DABS)
You can define a Lakeflow Job in your bundle YAML that includes a dashboard_task. This lets you schedule dashboard refreshes with full control over timing, retries, and notifications. Here is an example:
resources:
  dashboards:
    my-dashboard:
      display_name: "My Dashboard"
      file_path: ./dashboards/my-dashboard.lvdash.json
      warehouse_id: ${var.warehouse_id}

  jobs:
    refresh-my-dashboard:
      name: "Refresh My Dashboard"
      schedule:
        quartz_cron_expression: "0 0 8 * * ?"
        timezone_id: "America/New_York"
      tasks:
        - task_key: refresh-dashboard
          dashboard_task:
            dashboard_id: ${resources.dashboards.my-dashboard.id}
The dashboard_task supports these parameters:
- dashboard_id (required): the ID of the published dashboard to refresh
- warehouse_id (optional): SQL warehouse to use; defaults to the dashboard's configured warehouse
- subscription (optional): configure where to send dashboard snapshots after refresh (email, Slack, etc.)
This approach gives you the benefit of the full Lakeflow Jobs scheduling engine, including cron expressions, retries, conditional tasks, notifications, and the ability to chain the dashboard refresh after upstream data pipeline tasks.
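As a sketch, a dashboard_task with both optional parameters filled in might look like the fragment below. The subscriber field name (user_name) is my assumption based on the Jobs API subscription schema, so verify it against the current bundle/job reference before relying on it:

```yaml
tasks:
  - task_key: refresh-dashboard
    dashboard_task:
      dashboard_id: ${resources.dashboards.my-dashboard.id}
      # Optional: override the warehouse configured on the dashboard
      warehouse_id: ${var.warehouse_id}
      # Optional: email a snapshot to subscribers after each refresh
      # (subscriber field name assumed from the Jobs API schema)
      subscription:
        subscribers:
          - user_name: someone@example.com
```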
OPTION 2: REST API FOR NATIVE DASHBOARD SCHEDULES
If you prefer the built-in dashboard schedule (the same one you configure manually in the UI), you can use the Lakeview REST API to create schedules programmatically. The endpoint is:
POST /api/2.0/lakeview/dashboards/{dashboard_id}/schedules
This creates a native refresh schedule on the dashboard. However, this API call is not directly supported as a declarative property in the DABs dashboard resource schema. You would need to make this API call separately (for example, via a post-deployment script or a notebook task).
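If you go this route, a minimal post-deployment script might look like the sketch below, using only the Python standard library. The endpoint path comes from the documentation above; the request-body field names (cron_schedule, quartz_cron_expression, timezone_id, pause_status) are my assumption from the Lakeview API's schedule object, so check them against the current API reference:

```python
import json
import urllib.request


def build_schedule_payload(cron_expression: str, timezone_id: str) -> dict:
    """Build the request body for the Lakeview schedules endpoint.

    Field names are assumed from the Lakeview API's cron_schedule object;
    verify against the current API reference before relying on them.
    """
    return {
        "cron_schedule": {
            "quartz_cron_expression": cron_expression,
            "timezone_id": timezone_id,
        },
        "pause_status": "UNPAUSED",
    }


def create_dashboard_schedule(host: str, token: str, dashboard_id: str,
                              cron_expression: str, timezone_id: str) -> dict:
    """POST a native refresh schedule to a published dashboard."""
    url = f"{host}/api/2.0/lakeview/dashboards/{dashboard_id}/schedules"
    payload = build_schedule_payload(cron_expression, timezone_id)
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

You could call create_dashboard_schedule from a CI step after databricks bundle deploy, passing the deployed dashboard's ID.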
WHICH APPROACH TO USE
For most DABs workflows, Option 1 (the dashboard_task inside a job) is the better fit. It keeps everything declarative within your bundle YAML, integrates with your deployment lifecycle, and allows you to sequence the dashboard refresh after upstream ETL jobs complete. This is also the pattern the Databricks documentation recommends for workflow orchestration of dashboard refreshes.
Relevant documentation:
- Dashboard task types in bundles: https://docs.databricks.com/en/dev-tools/bundles/job-task-types.html
- Dashboard tasks in Lakeflow Jobs: https://docs.databricks.com/aws/en/jobs/dashboard
- Schedule and subscribe to dashboards (UI/API): https://docs.databricks.com/aws/en/dashboards/share/schedule-subscribe
- Dashboard resource in bundles: https://docs.databricks.com/en/dev-tools/bundles/resources.html
* This reply was drafted with an agent system I built, which researches answers from a broad set of documentation and prior memory. I personally review each draft for obvious issues and to monitor the system's reliability, and I update it when I detect drift, but there is still a small chance something is inaccurate, especially if you are experimenting with brand-new features.
If this answer resolves your question, could you mark it as "Accept as Solution"? That helps other users quickly find the correct fix.