How Databricks Metric Views Are Replacing Power BI Import Models — and What Your Team Needs to Do About It
Introduction
Power BI Import models work — until scheduled refreshes, size limits, and governance sprawl become too big to ignore. Databricks Unity Catalog Metric Views, connected via Direct Query, offer a compelling alternative: real-time data, no size caps, and centralized governance.
This post walks you through everything you need to evaluate before making this move: what changes, what doesn’t, and how to approach the migration in a way that doesn’t break what you’ve already built.
What Are Databricks Metric Views?
Metric Views are a centralized semantic layer inside Unity Catalog. You define dimensions, measures, and business logic once — in Databricks — and expose it to any consuming tool.
- Logic lives in Databricks, not Power BI
- Consumed by Power BI, Tableau, or any SQL-compatible tool
- Governed and secured at the source via Unity Catalog
The key difference from a Power BI semantic model is where the intelligence sits. With a traditional Import model, Power BI owns the logic: relationships, DAX measures, calculated columns. With Metric Views, Databricks owns it. Power BI becomes more of a visualization layer on top.
Creating and Managing Metric Views in Databricks
Metric Views can be created in two ways:
Via YAML (Code-first approach)
Define your Metric View in a YAML file and deploy via the Databricks CLI or CI/CD pipeline. This is the recommended approach for version-controlled, team-managed environments.
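As an illustrative sketch (the exact field set may vary by Databricks release, and all catalog, table, and column names here are hypothetical), a minimal YAML-defined Metric View looks like this:

```sql
-- Minimal Metric View sketch: dimensions and measures defined once, at the source.
-- All names (catalog.schema.*, columns) are hypothetical placeholders.
CREATE OR REPLACE VIEW catalog.schema.sales_metrics
WITH METRICS
LANGUAGE YAML
AS $$
version: 0.1
source: catalog.schema.fact_sales
dimensions:
  - name: Region
    expr: region
  - name: Order Year
    expr: YEAR(order_date)
measures:
  - name: Total Revenue
    expr: SUM(revenue)
  - name: Order Count
    expr: COUNT(DISTINCT order_id)
$$;
```

Because the definition is plain text, it versions cleanly in Git and deploys through the same CI/CD pipeline as the rest of your Databricks code.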

Via Databricks UI (No-code approach)
- Navigate to Unity Catalog → select your catalog and schema
- Click Create → Metric View
- Define dimensions and measures using the visual form editor
- Save and publish — the view is immediately available to connected tools
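However it was created, the published view can be validated with a quick query. Measures in a Metric View are evaluated with the MEASURE() aggregate rather than a raw column reference (the view and field names below are hypothetical):

```sql
-- Sanity-check a published Metric View from the SQL editor.
SELECT
  `Region`,
  MEASURE(`Total Revenue`) AS total_revenue
FROM catalog.schema.sales_metrics
GROUP BY `Region`;
```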
Governance note: ownership shifts from the Power BI developer to the Databricks platform or data engineering team. Plan this handoff explicitly.

Connecting Metric Views to Power BI via Direct Query
- Use the Databricks connector in Power BI Desktop
- Point it at your SQL Warehouse endpoint
- Select the Metric Views to include in your dataset
- Every report interaction sends a live SQL query to Databricks — no data is cached in Power BI
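Because nothing is cached, each visual translates into a live query against the Metric View. As a conceptual sketch (names hypothetical), a bar chart of revenue by region with a year slicer applied generates something like:

```sql
-- Conceptual shape of the query a filtered Power BI visual issues
-- under Direct Query; object names are hypothetical.
SELECT
  `Region`,
  MEASURE(`Total Revenue`) AS total_revenue
FROM catalog.schema.sales_metrics
WHERE `Order Year` = 2024
GROUP BY `Region`;
```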


Direct Query vs Import Mode: The Core Differences

- Data location: Import copies data into Power BI's in-memory model; Direct Query leaves it in Databricks and caches nothing in Power BI
- Freshness: Import is only as current as the last scheduled refresh; Direct Query returns live results on every interaction
- Size: Import models run into dataset size caps; Direct Query has no such limit because no data is stored in Power BI
- Dependency: Import only needs the source during refresh windows; Direct Query needs it whenever reports are in use

The shift from scheduled refresh to real-time queries sounds like a pure upgrade — and in many ways it is. But it also means your SQL Warehouse needs to be available and appropriately sized at all times, not just during refresh windows.
Impact on Your Existing Power BI Reports
Much of what you’ve already built will survive the migration without changes.
What stays the same
- All visual types — charts, tables, cards, matrices — continue to work as-is
- Filters, slicers, and cross-filter behavior are unaffected
- Report layouts and page designs carry over without modification
- Dashboards that pin visuals from reports continue to function
What may need attention
- Relationships — may need redesign based on Metric View structure
- Calculated columns — must move to the Databricks SQL layer
- Complex DAX — iterator functions and context transitions may break
- Custom aggregations — may conflict with Metric View definitions
Impact on Measures, Relationships, and Calculated Columns
- Measures: Simple SUM/COUNT DAX works. Complex logic should move into Databricks metric definitions.
- Relationships: Validate against the Metric View structure. Many-to-many and bridge tables need careful review.
- Calculated columns: Push all column-level logic into Databricks SQL — this is the most common migration pain point.
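To make the calculated-column point concrete, here is a hedged sketch of moving two typical DAX-style columns into a Databricks SQL view; all object and column names are hypothetical:

```sql
-- Replace Power BI calculated columns with columns computed at the source.
-- Hypothetical names throughout.
CREATE OR REPLACE VIEW catalog.schema.fact_sales_enriched AS
SELECT
  s.*,
  s.revenue - s.cost AS margin,          -- was: Margin = Sales[Revenue] - Sales[Cost]
  CASE                                   -- was: a DAX SWITCH/IF calculated column
    WHEN s.revenue >= 100000 THEN 'Enterprise'
    ELSE 'SMB'
  END AS customer_segment
FROM catalog.schema.fact_sales AS s;
```

The enriched view (or the Metric View built on it) then becomes the single place where this logic is defined, instead of being re-derived per dataset.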
Security: Migrating Row-Level Security

- Before: RLS enforced inside Power BI at the dataset level — scoped to Power BI only
- After: Unity Catalog row filters enforced at the source — applies to all tools
- Benefits: centralized, auditable, reusable across Tableau, notebooks, and APIs
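As a sketch of what the "after" state can look like, assuming hypothetical group names and a region column, a Unity Catalog row filter is a SQL UDF attached to the source table:

```sql
-- Hypothetical row filter: admins see every row; everyone else sees
-- only rows for regions whose account group they belong to.
CREATE OR REPLACE FUNCTION catalog.schema.region_filter(region STRING)
RETURN is_account_group_member('finance_admins')
    OR is_account_group_member(concat('region_', region));

ALTER TABLE catalog.schema.fact_sales
  SET ROW FILTER catalog.schema.region_filter ON (region);
```

Because the filter lives on the table, every tool that queries through Unity Catalog (Power BI, Tableau, notebooks, APIs) gets the same row-level enforcement.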
Performance Considerations with Direct Query
Performance is the most common concern teams raise when considering this move, and it’s a legitimate one. With no cached data in Power BI, every visual and slicer interaction issues a live SQL query, so report responsiveness is driven by SQL Warehouse sizing, availability, and query concurrency rather than by a refresh schedule.

Risks and How to Mitigate Them
The main risks are the ones surfaced above: a SQL Warehouse that must be available and appropriately sized at all times, complex DAX (iterators, context transitions) that may break, calculated columns that must be rewritten in SQL, and an RLS migration from Power BI roles to Unity Catalog row filters. The phased approach below is designed to surface each of these before anything reaches production.
Migration Recommendation: A Phased Approach
- Step 1 — Validate: Audit DAX, RLS, calculated columns, and relationships
- Step 2 — Pilot: Migrate one non-critical dataset end-to-end; validate reports and security
- Step 3 — Parallel testing: Run old and new in parallel; compare output and performance with real users
- Step 4 — Production rollout: Migrate datasets from lowest to highest complexity; decommission Import models only after stability is confirmed
Final thought
Moving from Power BI Import models to Databricks Metric Views via Direct Query is genuinely worth considering — the governance, scalability, and real-time data benefits are real. But it’s an architectural shift, not just a configuration change. Teams that approach it with a clear audit, a phased plan, and cross-functional alignment tend to come out ahead. Teams that underestimate the semantic layer redesign and the security migration usually don’t.