<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Hi @smpa011, METRIC VIEW DDL ON ALL-PURPOSE COMPUTE (spar... in Warehousing &amp; Analytics</title>
    <link>https://community.databricks.com/t5/warehousing-analytics/parity-between-spark-sql-and-sql-warehouse-for-metric-views-amp/m-p/150242#M2526</link>
    <description>&lt;P&gt;Hi &lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/157565"&gt;@smpa011&lt;/a&gt;,&lt;/P&gt;
&lt;P&gt;METRIC VIEW DDL ON ALL-PURPOSE COMPUTE (spark.sql / %sql)&lt;/P&gt;
&lt;P&gt;The CREATE VIEW ... WITH METRICS DDL requires Databricks Runtime 17.2 or above. This applies to both SQL warehouses and all-purpose clusters. If your notebook is attached to a cluster running an older DBR version, the Spark SQL parser will not recognize the WITH METRICS clause and will throw a PARSE_SYNTAX_ERROR.&lt;/P&gt;
&lt;P&gt;To fix this:&lt;/P&gt;
&lt;P&gt;1. Check your cluster's DBR version. Navigate to the cluster configuration page and confirm it is running DBR 17.2 or later.&lt;BR /&gt;
2. If it is on an older runtime, edit the cluster and select Databricks Runtime 17.2+ from the runtime dropdown, then restart the cluster.&lt;BR /&gt;
3. Once you are on DBR 17.2+, the same CREATE OR REPLACE VIEW ... WITH METRICS ... LANGUAGE YAML DDL should work in spark.sql(), %sql magic commands, and the SQL editor attached to a SQL warehouse.&lt;/P&gt;
&lt;P&gt;The prerequisite from the documentation states: you need "CAN USE permissions on a SQL warehouse or other compute resource running Databricks Runtime 17.2 or above."&lt;/P&gt;
&lt;P&gt;Reference: &lt;A href="https://docs.databricks.com/aws/en/metric-views/create/sql" target="_blank"&gt;https://docs.databricks.com/aws/en/metric-views/create/sql&lt;/A&gt;&lt;/P&gt;
&lt;P&gt;So this is not an intentional limitation that restricts metric view DDL to SQL warehouses only. Once your compute is on DBR 17.2+, you can manage metric views programmatically through spark.sql() in notebooks, which fully supports infrastructure-as-code and automated testing workflows.&lt;/P&gt;
&lt;P&gt;Additionally, the Databricks REST API and the manage_metric_views tool in the Databricks AI Dev Kit also support creating, altering, describing, and querying metric views programmatically if you want a non-SQL approach.&lt;/P&gt;
&lt;P&gt;DATA MODEL / ERD VISUALIZATION&lt;/P&gt;
&lt;P&gt;Currently, Catalog Explorer does not provide a dedicated ERD-style visualization for metric views or their join relationships. However, there are a few related capabilities worth noting:&lt;/P&gt;
&lt;P&gt;1. Lineage Graph: In Catalog Explorer, you can navigate to your metric view and open the Lineage tab. This shows an interactive graph of upstream sources (tables, views) and downstream consumers (dashboards, notebooks, jobs). You can click nodes to expand connections and drill into column-level lineage as well.&lt;BR /&gt;
   Reference: &lt;A href="https://docs.databricks.com/aws/en/data-governance/unity-catalog/data-lineage" target="_blank"&gt;https://docs.databricks.com/aws/en/data-governance/unity-catalog/data-lineage&lt;/A&gt;&lt;/P&gt;
&lt;P&gt;2. Metric View Overview Page: After creating a metric view through the Catalog Explorer UI, the overview page displays the source, filter, and all specified measures and dimensions. You can also view and edit the underlying YAML definition directly.&lt;BR /&gt;
   Reference: &lt;A href="https://docs.databricks.com/aws/en/metric-views/create/" target="_blank"&gt;https://docs.databricks.com/aws/en/metric-views/create/&lt;/A&gt;&lt;/P&gt;
&lt;P&gt;3. DESCRIBE TABLE EXTENDED ... AS JSON: For programmatic inspection of the metric view definition, including all joins, dimensions, and measures, you can run:&lt;/P&gt;
&lt;P&gt;   DESCRIBE TABLE EXTENDED catalog.schema.your_metric_view AS JSON&lt;/P&gt;
&lt;P&gt;The lineage graph is the closest feature to an ERD today. It visualizes the relationships between your metric view and the underlying source tables/views, though it does not render the star/snowflake schema join structure defined in your YAML specification as a traditional ERD.&lt;/P&gt;
&lt;P&gt;SUMMARY&lt;/P&gt;
&lt;P&gt;- Upgrade your cluster to DBR 17.2+ and the metric view DDL will work in spark.sql() and %sql just as it does on a SQL warehouse.&lt;BR /&gt;
- For data model visualization, use the Lineage tab in Catalog Explorer to see upstream/downstream relationships.&lt;/P&gt;
&lt;P&gt;* This reply was drafted with an agent system I built, which researches answers from the documentation I have available and from previous memory. I personally review each draft for obvious issues, monitor the system for reliability, and update it when I detect drift, but there is still a small chance something is inaccurate, especially if you are experimenting with brand-new features.&lt;/P&gt;
&lt;P&gt;If this answer resolves your question, could you mark it as "Accept as Solution"? That helps other users quickly find the correct fix.&lt;/P&gt;</description>
    <pubDate>Sun, 08 Mar 2026 19:07:50 GMT</pubDate>
    <dc:creator>SteveOstrowski</dc:creator>
    <dc:date>2026-03-08T19:07:50Z</dc:date>
    <item>
      <title>Parity between Spark.sql and SQL Warehouse for Metric Views &amp; Model Visualization</title>
      <link>https://community.databricks.com/t5/warehousing-analytics/parity-between-spark-sql-and-sql-warehouse-for-metric-views-amp/m-p/143938#M2453</link>
      <description>&lt;P&gt;Hi everyone,&lt;/P&gt;&lt;P&gt;I’m exploring the new &lt;STRONG&gt;Databricks Metric Views (Semantic Layer)&lt;/STRONG&gt; and have two questions regarding programmatic management and UI visualization.&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;1. Parser Disparity&lt;/STRONG&gt;: spark.sql vs. SQL Warehouse&lt;/P&gt;&lt;P&gt;I'm noticing that CREATE OR REPLACE VIEW ... WITH METRICS fails with a PARSE_SYNTAX_ERROR when executed via spark.sql() in a notebook, but works perfectly when run in a SQL Warehouse.&lt;/P&gt;&lt;P&gt;Is this architectural limitation by design? Are there plans to incorporate the &lt;STRONG&gt;Metric View DDL&lt;/STRONG&gt; into the standard Spark parser so we can manage these programmatically via PySpark?&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;# This fails on standard clusters but is what I'd like to achieve:
spark.sql("""
CREATE OR REPLACE VIEW sales_metrics
WITH METRICS
LANGUAGE YAML
AS $$
version: 1.1
source: catalog.schema.fact_sales
joins:
  - name: dim_customer
    source: catalog.schema.dim_customer
    on: source.customer_id = dim_customer.customer_id
measures:
  - name: total_amount
    expr: sum(amount)
$$
""")&lt;/LI-CODE&gt;&lt;H4&gt;&lt;STRONG&gt;2. Graphical Data Model Visualization&lt;/STRONG&gt;&lt;/H4&gt;&lt;P&gt;Coming from a Power BI/SSAS background, I am looking for a way to visualize the relationships defined in the Metric View's YAML (the Star Schema) graphically.&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;&lt;P&gt;Is there a way to view an &lt;STRONG&gt;Entity Relationship Diagram (ERD)&lt;/STRONG&gt; for Metric Views within Catalog Explorer today?&lt;/P&gt;&lt;/LI&gt;&lt;LI&gt;&lt;P&gt;If not, is a graphical "Model View" on the roadmap to help verify complex relationships and join logic visually?&lt;/P&gt;&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;Thanks in advance for the help!&lt;/P&gt;</description>
      <pubDate>Tue, 13 Jan 2026 19:30:34 GMT</pubDate>
      <guid>https://community.databricks.com/t5/warehousing-analytics/parity-between-spark-sql-and-sql-warehouse-for-metric-views-amp/m-p/143938#M2453</guid>
      <dc:creator>smpa011</dc:creator>
      <dc:date>2026-01-13T19:30:34Z</dc:date>
    </item>
    <item>
      <title>Re: Parity between Spark.sql and SQL Warehouse for Metric Views &amp; Model Visualization</title>
      <link>https://community.databricks.com/t5/warehousing-analytics/parity-between-spark-sql-and-sql-warehouse-for-metric-views-amp/m-p/144075#M2457</link>
      <description>&lt;P&gt;To add to #1 above&lt;/P&gt;&lt;P&gt;The following&lt;/P&gt;&lt;P&gt;```sql&lt;BR /&gt;CREATE OR REPLACE VIEW catalog.schema.customer_sales_metric_view&lt;BR /&gt;-------&lt;BR /&gt;```&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;span class="lia-unicode-emoji" title=":white_heavy_check_mark:"&gt;✅&lt;/span&gt; **Works**: SQL Warehouse/Editor&lt;BR /&gt;&lt;span class="lia-unicode-emoji" title=":cross_mark:"&gt;❌&lt;/span&gt; **Fails**: PySpark spark.sql()&lt;BR /&gt;&lt;span class="lia-unicode-emoji" title=":cross_mark:"&gt;❌&lt;/span&gt; **Fails**: Notebook %sql Magic&lt;BR /&gt;&lt;BR /&gt;Business Impact&lt;BR /&gt;This limitation significantly impacts:&lt;BR /&gt;- Infrastructure as Code: Cannot version-control and deploy metric view definitions programmatically&lt;BR /&gt;- Automated Testing: Cannot create ephemeral metric views for testing purposes&lt;BR /&gt;- Development Workflow: Requires context switching between notebook development and SQL Warehouse execution&lt;/P&gt;&lt;P&gt;Alternative APIs&lt;BR /&gt;Also, is there a programmatic API for creating metric views that I'm missing? The current limitation forces hybrid workflows with manual steps.&lt;/P&gt;&lt;P&gt;The ability to programmatically manage metric views would greatly enhance the developer experience and enable more sophisticated data platform automation.&lt;/P&gt;</description>
      <pubDate>Wed, 14 Jan 2026 17:04:32 GMT</pubDate>
      <guid>https://community.databricks.com/t5/warehousing-analytics/parity-between-spark-sql-and-sql-warehouse-for-metric-views-amp/m-p/144075#M2457</guid>
      <dc:creator>smpa011</dc:creator>
      <dc:date>2026-01-14T17:04:32Z</dc:date>
    </item>
    <item>
      <title>Re: Parity between Spark.sql and SQL Warehouse for Metric Views &amp; Model Visualization</title>
      <link>https://community.databricks.com/t5/warehousing-analytics/parity-between-spark-sql-and-sql-warehouse-for-metric-views-amp/m-p/144232#M2461</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/157565"&gt;@smpa011&lt;/a&gt;,&lt;/P&gt;&lt;P&gt;Metric views work on both SQL warehouse and classic clusters, but the cluster must be running DBR 17.2 or above. The error you are getting is because you may be using DBR version below 17.2 or so. try upgrading your DBR version or use SQL warehouse(classic, pro or serverless). check prerequisites in below link.&lt;/P&gt;&lt;P&gt;Regarding you 2nd question i think current there metric views don't offer ERD -style graphical visualization of yaml structure itself but the closest you get is lineage graph showing upstream/downstream dependencies.&lt;/P&gt;&lt;P&gt;Hope this clarifies!&lt;/P&gt;&lt;P&gt;&lt;A href="https://docs.databricks.com/aws/en/metric-views/create/sql" target="_blank"&gt;Use SQL to create and manage metric views | Databricks on AWS&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Fri, 16 Jan 2026 11:20:16 GMT</pubDate>
      <guid>https://community.databricks.com/t5/warehousing-analytics/parity-between-spark-sql-and-sql-warehouse-for-metric-views-amp/m-p/144232#M2461</guid>
      <dc:creator>sandy_123</dc:creator>
      <dc:date>2026-01-16T11:20:16Z</dc:date>
    </item>
    <item>
      <title>Hi @smpa011, METRIC VIEW DDL ON ALL-PURPOSE COMPUTE (spar...</title>
      <link>https://community.databricks.com/t5/warehousing-analytics/parity-between-spark-sql-and-sql-warehouse-for-metric-views-amp/m-p/150242#M2526</link>
      <description>&lt;P&gt;Hi &lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/157565"&gt;@smpa011&lt;/a&gt;,&lt;/P&gt;
&lt;P&gt;METRIC VIEW DDL ON ALL-PURPOSE COMPUTE (spark.sql / %sql)&lt;/P&gt;
&lt;P&gt;The CREATE VIEW ... WITH METRICS DDL requires Databricks Runtime 17.2 or above. This applies to both SQL warehouses and all-purpose clusters. If your notebook is attached to a cluster running an older DBR version, the Spark SQL parser will not recognize the WITH METRICS clause and will throw a PARSE_SYNTAX_ERROR.&lt;/P&gt;
&lt;P&gt;To fix this:&lt;/P&gt;
&lt;P&gt;1. Check your cluster's DBR version. Navigate to the cluster configuration page and confirm it is running DBR 17.2 or later.&lt;BR /&gt;
2. If it is on an older runtime, edit the cluster and select Databricks Runtime 17.2+ from the runtime dropdown, then restart the cluster.&lt;BR /&gt;
3. Once you are on DBR 17.2+, the same CREATE OR REPLACE VIEW ... WITH METRICS ... LANGUAGE YAML DDL should work in spark.sql(), %sql magic commands, and the SQL editor attached to a SQL warehouse.&lt;/P&gt;
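&lt;P&gt;If you want to guard the DDL in automation, a notebook can check the runtime version before issuing the statement. A minimal sketch of such a check; note the spark.databricks.clusterUsageTags.sparkVersion config key and the "17.2.x-scala2.12"-style value format are assumptions worth verifying on your own workspace:&lt;/P&gt;

```python
def dbr_at_least(version_string, required=(17, 2)):
    """Return True if a DBR version string like '17.2.x-scala2.12'
    meets the required (major, minor) version."""
    release = version_string.split("-")[0]       # e.g. '17.2.x'
    parts = release.split(".")
    major, minor = int(parts[0]), int(parts[1])
    return (major, minor) >= required

# In a notebook you might then do (config key assumed -- verify first):
# runtime = spark.conf.get("spark.databricks.clusterUsageTags.sparkVersion")
# if not dbr_at_least(runtime):
#     raise RuntimeError("Metric view DDL requires DBR 17.2 or above")
```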
&lt;P&gt;The prerequisite from the documentation states: you need "CAN USE permissions on a SQL warehouse or other compute resource running Databricks Runtime 17.2 or above."&lt;/P&gt;
&lt;P&gt;Reference: &lt;A href="https://docs.databricks.com/aws/en/metric-views/create/sql" target="_blank"&gt;https://docs.databricks.com/aws/en/metric-views/create/sql&lt;/A&gt;&lt;/P&gt;
&lt;P&gt;So this is not an intentional limitation that restricts metric view DDL to SQL warehouses only. Once your compute is on DBR 17.2+, you can manage metric views programmatically through spark.sql() in notebooks, which fully supports infrastructure-as-code and automated testing workflows.&lt;/P&gt;
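&lt;P&gt;For example, an infrastructure-as-code job could template the DDL as a string and submit it through spark.sql(). A hedged sketch; the helper name and YAML body below are illustrative, not an official API:&lt;/P&gt;

```python
def build_metric_view_ddl(view_name: str, yaml_spec: str) -> str:
    """Assemble a CREATE OR REPLACE VIEW ... WITH METRICS statement
    from a fully qualified view name and a YAML metric definition."""
    return (
        f"CREATE OR REPLACE VIEW {view_name}\n"
        "WITH METRICS\n"
        "LANGUAGE YAML\n"
        f"AS $$\n{yaml_spec}\n$$"
    )

yaml_spec = """version: 1.1
source: catalog.schema.fact_sales
measures:
  - name: total_amount
    expr: sum(amount)"""

ddl = build_metric_view_ddl("catalog.schema.sales_metrics", yaml_spec)
# On DBR 17.2+ this string can then be passed to spark.sql(ddl)
```

&lt;P&gt;Because the statement is just a string, the YAML definition can live in version control and be deployed or torn down by the same job that runs your tests.&lt;/P&gt;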
&lt;P&gt;Additionally, the Databricks REST API and the manage_metric_views tool in the Databricks AI Dev Kit also support creating, altering, describing, and querying metric views programmatically if you want a non-SQL approach.&lt;/P&gt;
&lt;P&gt;DATA MODEL / ERD VISUALIZATION&lt;/P&gt;
&lt;P&gt;Currently, Catalog Explorer does not provide a dedicated ERD-style visualization for metric views or their join relationships. However, there are a few related capabilities worth noting:&lt;/P&gt;
&lt;P&gt;1. Lineage Graph: In Catalog Explorer, you can navigate to your metric view and open the Lineage tab. This shows an interactive graph of upstream sources (tables, views) and downstream consumers (dashboards, notebooks, jobs). You can click nodes to expand connections and drill into column-level lineage as well.&lt;BR /&gt;
   Reference: &lt;A href="https://docs.databricks.com/aws/en/data-governance/unity-catalog/data-lineage" target="_blank"&gt;https://docs.databricks.com/aws/en/data-governance/unity-catalog/data-lineage&lt;/A&gt;&lt;/P&gt;
&lt;P&gt;2. Metric View Overview Page: After creating a metric view through the Catalog Explorer UI, the overview page displays the source, filter, and all specified measures and dimensions. You can also view and edit the underlying YAML definition directly.&lt;BR /&gt;
   Reference: &lt;A href="https://docs.databricks.com/aws/en/metric-views/create/" target="_blank"&gt;https://docs.databricks.com/aws/en/metric-views/create/&lt;/A&gt;&lt;/P&gt;
&lt;P&gt;3. DESCRIBE TABLE EXTENDED ... AS JSON: For programmatic inspection of the metric view definition, including all joins, dimensions, and measures, you can run:&lt;/P&gt;
&lt;P&gt;   DESCRIBE TABLE EXTENDED catalog.schema.your_metric_view AS JSON&lt;/P&gt;
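&lt;P&gt;The JSON metadata comes back as a single string you can parse in Python. The sketch below uses an illustrative payload only; the real key names in the DESCRIBE ... AS JSON output may differ, so inspect the actual result on your workspace before relying on them:&lt;/P&gt;

```python
import json

# Illustrative payload -- the real DESCRIBE ... AS JSON schema may differ.
sample = json.dumps({
    "table_name": "your_metric_view",
    "view_definition": "version: 1.1\nsource: catalog.schema.fact_sales",
})

def summarize(metadata_json: str) -> dict:
    """Pull a couple of fields of interest out of the JSON metadata string."""
    meta = json.loads(metadata_json)
    return {"name": meta.get("table_name"),
            "definition": meta.get("view_definition")}

info = summarize(sample)
# In a notebook the string would come from something like (column position assumed):
# row = spark.sql(
#     "DESCRIBE TABLE EXTENDED catalog.schema.your_metric_view AS JSON").first()
# info = summarize(row[0])
```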
&lt;P&gt;The lineage graph is the closest feature to an ERD today. It visualizes the relationships between your metric view and the underlying source tables/views, though it does not render the star/snowflake schema join structure defined in your YAML specification as a traditional ERD.&lt;/P&gt;
&lt;P&gt;SUMMARY&lt;/P&gt;
&lt;P&gt;- Upgrade your cluster to DBR 17.2+ and the metric view DDL will work in spark.sql() and %sql just as it does on a SQL warehouse.&lt;BR /&gt;
- For data model visualization, use the Lineage tab in Catalog Explorer to see upstream/downstream relationships.&lt;/P&gt;
&lt;P&gt;* This reply was drafted with an agent system I built, which researches answers from the documentation I have available and from previous memory. I personally review each draft for obvious issues, monitor the system for reliability, and update it when I detect drift, but there is still a small chance something is inaccurate, especially if you are experimenting with brand-new features.&lt;/P&gt;
&lt;P&gt;If this answer resolves your question, could you mark it as "Accept as Solution"? That helps other users quickly find the correct fix.&lt;/P&gt;</description>
      <pubDate>Sun, 08 Mar 2026 19:07:50 GMT</pubDate>
      <guid>https://community.databricks.com/t5/warehousing-analytics/parity-between-spark-sql-and-sql-warehouse-for-metric-views-amp/m-p/150242#M2526</guid>
      <dc:creator>SteveOstrowski</dc:creator>
      <dc:date>2026-03-08T19:07:50Z</dc:date>
    </item>
  </channel>
</rss>

