
Parity between Spark.sql and SQL Warehouse for Metric Views & Model Visualization

smpa011
New Contributor II

Hi everyone,

I’m exploring the new Databricks Metric Views (Semantic Layer) and have two questions regarding programmatic management and UI visualization.

1. Parser Disparity: spark.sql vs. SQL Warehouse

I'm noticing that CREATE OR REPLACE VIEW ... WITH METRICS fails with a PARSE_SYNTAX_ERROR when executed via spark.sql() in a notebook, but works perfectly when run in a SQL Warehouse.

Is this an architectural limitation by design? Are there plans to incorporate the Metric View DDL into the standard Spark parser so we can manage these programmatically via PySpark?

```python
# This fails on standard clusters but is what I'd like to achieve:
spark.sql("""
CREATE OR REPLACE VIEW sales_metrics
WITH METRICS
LANGUAGE YAML
AS $$
version: 1.1
source: catalog.schema.fact_sales
joins:
  - name: dim_customer
    source: catalog.schema.dim_customer
    on: source.customer_id = dim_customer.customer_id
measures:
  - name: total_amount
    expr: sum(amount)
$$
""")
```

2. Graphical Data Model Visualization

Coming from a Power BI/SSAS background, I am looking for a way to visualize the relationships defined in the Metric View's YAML (the Star Schema) graphically.

  • Is there a way to view an Entity Relationship Diagram (ERD) for Metric Views within Catalog Explorer today?

  • If not, is a graphical "Model View" on the roadmap to help verify complex relationships and join logic visually?

Thanks in advance for the help!

2 REPLIES

smpa011
New Contributor II

To add to #1 above, the following statement:

```sql
CREATE OR REPLACE VIEW catalog.schema.customer_sales_metric_view
-------
```

**Works**: SQL Warehouse / SQL Editor
**Fails**: PySpark `spark.sql()`
**Fails**: Notebook `%sql` magic

**Business Impact**
This limitation significantly impacts:
- Infrastructure as Code: Cannot version-control and deploy metric view definitions programmatically
- Automated Testing: Cannot create ephemeral metric views for testing purposes (see the sketch after this list)
- Development Workflow: Requires context switching between notebook development and SQL Warehouse execution
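
To make the testing point concrete, here is the kind of ephemeral test this gap blocks today: a hypothetical pytest sketch (the `spark` fixture, catalog, schema, and fixture table names are all placeholders), assuming the DDL eventually parses through `spark.sql()`:

```python
# Hypothetical sketch: create an ephemeral metric view, query it with
# MEASURE(), and drop it. Today this fails on standard clusters because
# spark.sql() cannot parse the WITH METRICS DDL.
def test_total_amount_measure(spark):  # `spark` is an assumed session fixture
    spark.sql("""
    CREATE OR REPLACE VIEW test_catalog.tmp.sales_metrics_test
    WITH METRICS
    LANGUAGE YAML
    AS $$
    version: 1.1
    source: test_catalog.tmp.fact_sales_fixture
    measures:
      - name: total_amount
        expr: sum(amount)
    $$
    """)
    try:
        # MEASURE() is how metric-view measures are evaluated in a query.
        row = spark.sql("""
            SELECT MEASURE(total_amount) AS total
            FROM test_catalog.tmp.sales_metrics_test
        """).collect()[0]
        assert row["total"] is not None
    finally:
        spark.sql("DROP VIEW IF EXISTS test_catalog.tmp.sales_metrics_test")
```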

**Alternative APIs**
Also, is there a programmatic API for creating metric views that I'm missing? The current limitation forces hybrid workflows with manual steps.

The ability to programmatically manage metric views would greatly enhance the developer experience and enable more sophisticated data platform automation.
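
The closest thing I've found so far is the SQL Statement Execution API, which can submit the DDL to a SQL warehouse over REST, though it still depends on a warehouse rather than the cluster's own parser. A rough sketch (host, token, and warehouse ID are placeholders):

```python
# Sketch: submit the Metric View DDL via the SQL Statement Execution API
# (POST /api/2.0/sql/statements/). Host, token, and warehouse ID are
# placeholders for your workspace.
import requests

ddl = """CREATE OR REPLACE VIEW sales_metrics
WITH METRICS LANGUAGE YAML AS $$
version: 1.1
source: catalog.schema.fact_sales
measures:
  - name: total_amount
    expr: sum(amount)
$$"""

resp = requests.post(
    "https://<workspace-host>.cloud.databricks.com/api/2.0/sql/statements/",
    headers={"Authorization": "Bearer <personal-access-token>"},
    json={
        "warehouse_id": "<warehouse-id>",
        "statement": ddl,
        "wait_timeout": "30s",  # block up to 30s for the statement to finish
    },
)
resp.raise_for_status()
print(resp.json()["status"]["state"])  # e.g. "SUCCEEDED"
```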

sandy_123
Visitor

Hi @smpa011,

Metric views work on both SQL warehouses and classic clusters, but the cluster must be running DBR 17.2 or above. The error you are getting is likely because your cluster is on a DBR version below 17.2. Try upgrading your DBR version or use a SQL warehouse (classic, pro, or serverless). Check the prerequisites in the link below.
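
A quick way to confirm which runtime your notebook is on (Databricks sets the `DATABRICKS_RUNTIME_VERSION` environment variable on clusters):

```python
# Check the cluster's Databricks Runtime version before running the
# Metric View DDL; metric views need DBR 17.2 or above.
import os

print(os.environ.get("DATABRICKS_RUNTIME_VERSION", "unknown"))  # e.g. "17.2"
```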

Regarding your second question: currently metric views don't offer an ERD-style graphical visualization of the YAML structure itself, but the closest you get is the lineage graph, which shows upstream/downstream dependencies.

Hope this clarifies!

Use SQL to create and manage metric views | Databricks on AWS