Hi @smpa011,
METRIC VIEW DDL ON ALL-PURPOSE COMPUTE (spark.sql / %sql)
The CREATE VIEW ... WITH METRICS DDL requires Databricks Runtime 17.2 or above. This applies to both SQL warehouses and all-purpose clusters. If your notebook is attached to a cluster running an older DBR version, the Spark SQL parser will not recognize the WITH METRICS clause and will throw a PARSE_SYNTAX_ERROR.
To fix this:
1. Check your cluster's DBR version. Navigate to the cluster configuration page and confirm it is running DBR 17.2 or later.
2. If it is on an older runtime, edit the cluster and select Databricks Runtime 17.2+ from the runtime dropdown, then restart the cluster.
3. Once you are on DBR 17.2+, the same CREATE OR REPLACE VIEW ... WITH METRICS ... LANGUAGE YAML DDL should work in spark.sql(), %sql magic commands, and the SQL editor attached to a SQL warehouse.
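Before running the DDL, you can add a quick guard in your notebook that checks the runtime version. This is a minimal sketch, not official Databricks tooling; it assumes the cluster exposes its version through the `DATABRICKS_RUNTIME_VERSION` environment variable as a string like "17.2" or "17.2.x-scala2.12" (verify this on your cluster):

```python
import os

def supports_metric_view_ddl(version: str) -> bool:
    """Return True when a DBR version string such as "17.2" or
    "17.2.x-scala2.12" is at least 17.2, the documented minimum
    for CREATE VIEW ... WITH METRICS."""
    try:
        major, minor = (int(p) for p in version.split(".")[:2])
    except ValueError:
        return False  # non-numeric or incomplete version string
    return (major, minor) >= (17, 2)

# Assumption: on a Databricks cluster this env var carries the
# runtime version; check it before attempting the DDL.
dbr = os.environ.get("DATABRICKS_RUNTIME_VERSION", "")
```

If the check returns False, fail fast with a clear message instead of letting the parser surface a confusing PARSE_SYNTAX_ERROR.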
The prerequisite from the documentation states: you need "CAN USE permissions on a SQL warehouse or other compute resource running Databricks Runtime 17.2 or above."
Reference: https://docs.databricks.com/aws/en/metric-views/create/sql
So this is not an intentional limitation that restricts metric view DDL to SQL warehouses only. Once your compute is on DBR 17.2+, you can manage metric views programmatically through spark.sql() in notebooks, which fully supports infrastructure-as-code and automated testing workflows.
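To make the infrastructure-as-code angle concrete, here is a hedged sketch of assembling the DDL string for `spark.sql()`. The helper name is mine, and the YAML body you pass in must follow the documented metric view spec for your runtime; only the `CREATE OR REPLACE VIEW ... WITH METRICS ... LANGUAGE YAML` shape comes from the docs:

```python
def metric_view_ddl(fq_name: str, yaml_spec: str) -> str:
    """Assemble a CREATE OR REPLACE VIEW ... WITH METRICS statement
    from a fully qualified view name and a YAML metric definition."""
    return (
        f"CREATE OR REPLACE VIEW {fq_name}\n"
        "WITH METRICS\n"
        "LANGUAGE YAML\n"
        f"AS $$\n{yaml_spec}\n$$"
    )

# In a notebook on DBR 17.2+ you would then run (not executed here):
#   spark.sql(metric_view_ddl("main.sales.orders_metrics", yaml_spec))
```

Keeping the YAML spec in version control and applying it through a helper like this is what makes automated testing and CI deployment of metric views straightforward.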
Additionally, the Databricks REST API and the manage_metric_views tool in the Databricks AI Dev Kit also support creating, altering, describing, and querying metric views programmatically if you want a non-SQL approach.
DATA MODEL / ERD VISUALIZATION
Currently, Catalog Explorer does not provide a dedicated ERD-style visualization for metric views or their join relationships. However, there are a few related capabilities worth noting:
1. Lineage Graph: In Catalog Explorer, you can navigate to your metric view and open the Lineage tab. This shows an interactive graph of upstream sources (tables, views) and downstream consumers (dashboards, notebooks, jobs). You can click nodes to expand connections and drill into column-level lineage as well.
Reference: https://docs.databricks.com/aws/en/data-governance/unity-catalog/data-lineage
2. Metric View Overview Page: After creating a metric view through the Catalog Explorer UI, the overview page displays the source, filter, and all specified measures and dimensions. You can also view and edit the underlying YAML definition directly.
Reference: https://docs.databricks.com/aws/en/metric-views/create/
3. DESCRIBE TABLE EXTENDED ... AS JSON: For programmatic inspection of the metric view definition, including all joins, dimensions, and measures, you can run:
DESCRIBE TABLE EXTENDED catalog.schema.your_metric_view AS JSON
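Once you have the JSON back in a notebook, it can be parsed with the standard json module. A minimal sketch, with the caveat that the payload shape shown here (a top-level "measures" list with "name"/"expr" keys) is a hypothetical illustration; inspect the real output on your runtime before relying on specific field names:

```python
import json

def extract_measures(describe_json: str) -> list[str]:
    """Pull measure names out of the JSON document returned by
    DESCRIBE TABLE EXTENDED ... AS JSON. The "measures" key and its
    shape are assumptions; verify against your runtime's output."""
    doc = json.loads(describe_json)
    return [m["name"] for m in doc.get("measures", [])]

# Hypothetical payload shape, for illustration only:
sample = '{"measures": [{"name": "total_revenue", "expr": "SUM(amount)"}]}'
# extract_measures(sample) -> ["total_revenue"]
```

This pattern is handy in automated tests that assert a deployed metric view exposes the measures and dimensions you expect.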
The lineage graph is the closest feature to an ERD today. It visualizes the relationships between your metric view and the underlying source tables/views, though it does not render the star/snowflake schema join structure defined in your YAML specification as a traditional ERD.
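For orientation, the join structure that the lineage graph does not render lives in the YAML definition itself. A hedged sketch of what that section can look like (field names follow the documented metric view spec as I understand it, but the table names are placeholders and the exact schema may differ by version, so check the metric views reference for your workspace):

```yaml
# Hypothetical metric view definition; verify field names against the docs.
version: 0.1
source: main.sales.fact_orders
joins:
  - name: customers
    source: main.sales.dim_customers
    on: source.customer_id = customers.customer_id
dimensions:
  - name: customer_region
    expr: customers.region
measures:
  - name: total_revenue
    expr: SUM(source.amount)
```

Reading the joins section of the YAML is currently the most direct way to see the star/snowflake structure that an ERD would draw.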
SUMMARY
- Upgrade your cluster to DBR 17.2+ and the metric view DDL will work in spark.sql() and %sql just as it does on a SQL warehouse.
- For data model visualization, use the Lineage tab in Catalog Explorer to see upstream/downstream relationships.
* This reply was drafted with an agent system I built, which researches responses from a wide set of documentation and prior memory. I personally review each draft for obvious issues and to monitor system reliability, and I correct it when I detect drift, but there is still a small chance something is inaccurate, especially if you are experimenting with brand-new features.
If this answer resolves your question, could you mark it as "Accept as Solution"? That helps other users quickly find the correct fix.