Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Unable to add column comment in Materialized View (MV)

leungi
Contributor

The following doc suggests that column comments can be added during MV creation via the `column list` parameter.

Thus, the SQL code below is expected to generate a table where the columns `col_1` and `col_2` are commented; however, this is not the case.

Note that the underlying table for the MV, `my_streaming_table`, is a Streaming Table.

%sql
USE CATALOG my_catalog;

USE SCHEMA my_schema;

DROP MATERIALIZED VIEW IF EXISTS my_mv;

CREATE MATERIALIZED VIEW my_mv
  (col_1 COMMENT 'Column 1', col_2 COMMENT 'Column 2')
  COMMENT 'MV of a streaming table.'
AS (
  SELECT col_1, col_2 FROM my_streaming_table
)

1 ACCEPTED SOLUTION


raphaelblg
Databricks Employee

@leungi that's because:

"All materialized views are backed by a DLT pipeline. You can refresh materialized views manually, on a schedule, or by scheduling the DLT pipeline in which they're contained."

At https://docs.databricks.com/en/sql/language-manual/sql-ref-syntax-ddl-create-materialized-view.html

In order to create materialized views with a typed column list, you need to use Delta Live Tables.

Best regards,

Raphael Balogo
Sr. Technical Solutions Engineer
Databricks


6 REPLIES

raphaelblg
Databricks Employee

Hello @leungi ,

I've reproduced the issue and it appears that specifying the data type of the column is necessary for the statement to work. Here's an example:

 

CREATE MATERIALIZED VIEW my_mv
 (col_1 STRING COMMENT 'Column 1', col_2 INT COMMENT 'Column 2')

 

I will consult with Databricks engineering to determine if this is the expected behavior or if there is potential for enhancement.

 

Best regards,

Raphael Balogo
Sr. Technical Solutions Engineer
Databricks

@raphaelblg thanks for the reply.

I tried the suggested edits, but came up with the following error:

[DLT_VIEW_SCHEMA_WITH_TYPE_NOT_SUPPORTED] MATERIALIZED VIEW schemas with a specified type are supported only in a Delta Live Tables pipeline. SQLSTATE: 56038; line 1 pos 0

The DLT docs for creating materialized views don't indicate a way to add column comments, as you noted.

raphaelblg
Databricks Employee

@leungi you've shared the Python language reference. My example is based on the SQL reference.

Best regards,

Raphael Balogo
Sr. Technical Solutions Engineer
Databricks

Noted.

I tried the following in both SQL Editor and Notebook and got the same outcome:

 

%sql
USE CATALOG my_catalog;

USE SCHEMA my_schema;

DROP MATERIALIZED VIEW IF EXISTS my_mv;

CREATE MATERIALIZED VIEW my_mv
  (col_1 STRING COMMENT 'Column 1', col_2 STRING COMMENT 'Column 2')
  COMMENT 'MV of a streaming table.'
AS (
  SELECT col_1, col_2 FROM my_streaming_table
)


Managed to get it working per your guidance 😀

For future me:

1. Create a SQL notebook.

CREATE OR REFRESH LIVE TABLE my_dlt_mv
(col_1 STRING COMMENT 'column_1', col_2 STRING COMMENT 'column_2')
COMMENT 'MV of a streaming table.'
AS (
  SELECT col_1, col_2 FROM catalog.schema.streaming_table
)

2. Create a DLT pipeline for the notebook and run it.
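Once the pipeline has run and published the table, the column comments can be checked from the SQL editor. A minimal verification sketch, assuming the pipeline publishes `my_dlt_mv` to the same `catalog.schema` used in the snippet above:

```sql
-- Inspect the published materialized view's schema; the comment column
-- of the output should show 'column_1' and 'column_2' for col_1 and col_2.
DESCRIBE TABLE catalog.schema.my_dlt_mv;
```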
