<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Materialized view creation fails in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/materialized-view-creation-fails/m-p/154714#M54126</link>
    <description>&lt;P&gt;Hi,&lt;BR /&gt;&lt;BR /&gt;I have run into a problem when creating a materialized view.&lt;BR /&gt;&lt;BR /&gt;Here's the simple query I'm trying to run:&lt;/P&gt;&lt;LI-CODE lang="python"&gt;%sql
create or replace materialized view catalog.schema.mView_test
as select * from catalog.schema.table limit 10;&lt;/LI-CODE&gt;&lt;P&gt;I'm getting the following error:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;Encountered an error with Unity Catalog while setting up the pipeline on cluster xxxx-xxxxxx-xxxxxxxx-xxx. 
Ensure that your Unity Catalog configuration is correct, and that required resources (e.g., catalog, schema) exist and are accessible. 
Also verify that the cluster has appropriate permissions to access Unity Catalog.

Details: Operation failed: "This request is not authorized to perform this operation.", 403, GET, https://storageaccount.dfs.core.windows.net/container?upn=false&amp;amp;amp;beginFrom=0000000000000000000&amp;amp;amp;resource=filesystem&amp;amp;amp;maxResults=5000&amp;amp;amp;directory=catalog/schema/__unitystorage/schemas/9e94de07-1f9d-4798-b250-d34f6f2b769d/tables/b34a78c7-fbcc-4265-a331-da4372e59afc/_delta_log&amp;amp;amp;timeout=90&amp;amp;amp;recursive=false&amp;amp;amp;st=2026-04-16T07:00:09Z&amp;amp;amp;sv=2020-02-10&amp;amp;amp;ske=2026-04-16T09:00:09Z&amp;amp;amp;sig=XXXXX&amp;amp;amp;sktid=2fb08174-a150-479d-8d15-2174da71a11a&amp;amp;amp;se=2026-04-16T08:17:22Z&amp;amp;amp;sdd=7&amp;amp;amp;skoid=1456c2e6-8869-41a4XXXXXXXXXXXXXXXXXX&amp;amp;amp;spr=https&amp;amp;amp;sks=b&amp;amp;amp;skt=2026-04-16T07:00:09Z&amp;amp;amp;sp=racwdxlm&amp;amp;amp;skv=2025-01-05&amp;amp;amp;sr=d, AuthorizationFailure, , "This request is not authorized to perform this operation. RequestId:1b89a867-b01f-0056-1b71-cdf6f8000000 Time:2026-04-16T07:17:26.4365052Z"&lt;/LI-CODE&gt;&lt;P&gt;I'm running the query on our own SQL Warehouse, not a serverless SQL warehouse.&lt;/P&gt;&lt;P&gt;I have verified the following:&lt;/P&gt;&lt;OL&gt;&lt;LI&gt;I have permissions to the catalog and schema&lt;/LI&gt;&lt;LI&gt;Browsing the external location works&lt;/LI&gt;&lt;LI&gt;The access connector that the storage credential is mapped to has the Storage Blob Data Contributor role on the storage account&lt;/LI&gt;&lt;/OL&gt;&lt;P&gt;My suspicion is that the culprit here is that&amp;nbsp;&lt;A href="https://learn.microsoft.com/fi-fi/azure/databricks/ldp/dbsql/materialized#mv-requirements" target="_self"&gt;materialized views are backed by a serverless pipeline&lt;/A&gt;, even though I'm not using serverless compute to run my notebook. Could this be the issue here? If so, how do I fix this?&lt;/P&gt;</description>
    <pubDate>Thu, 16 Apr 2026 07:33:12 GMT</pubDate>
    <dc:creator>PNC</dc:creator>
    <dc:date>2026-04-16T07:33:12Z</dc:date>
    <item>
      <title>Materialized view creation fails</title>
      <link>https://community.databricks.com/t5/data-engineering/materialized-view-creation-fails/m-p/154714#M54126</link>
      <description>&lt;P&gt;Hi,&lt;BR /&gt;&lt;BR /&gt;I have run into a problem when creating a materialized view.&lt;BR /&gt;&lt;BR /&gt;Here's the simple query I'm trying to run:&lt;/P&gt;&lt;LI-CODE lang="python"&gt;%sql
create or replace materialized view catalog.schema.mView_test
as select * from catalog.schema.table limit 10;&lt;/LI-CODE&gt;&lt;P&gt;I'm getting the following error:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;Encountered an error with Unity Catalog while setting up the pipeline on cluster xxxx-xxxxxx-xxxxxxxx-xxx. 
Ensure that your Unity Catalog configuration is correct, and that required resources (e.g., catalog, schema) exist and are accessible. 
Also verify that the cluster has appropriate permissions to access Unity Catalog.

Details: Operation failed: "This request is not authorized to perform this operation.", 403, GET, https://storageaccount.dfs.core.windows.net/container?upn=false&amp;amp;amp;beginFrom=0000000000000000000&amp;amp;amp;resource=filesystem&amp;amp;amp;maxResults=5000&amp;amp;amp;directory=catalog/schema/__unitystorage/schemas/9e94de07-1f9d-4798-b250-d34f6f2b769d/tables/b34a78c7-fbcc-4265-a331-da4372e59afc/_delta_log&amp;amp;amp;timeout=90&amp;amp;amp;recursive=false&amp;amp;amp;st=2026-04-16T07:00:09Z&amp;amp;amp;sv=2020-02-10&amp;amp;amp;ske=2026-04-16T09:00:09Z&amp;amp;amp;sig=XXXXX&amp;amp;amp;sktid=2fb08174-a150-479d-8d15-2174da71a11a&amp;amp;amp;se=2026-04-16T08:17:22Z&amp;amp;amp;sdd=7&amp;amp;amp;skoid=1456c2e6-8869-41a4XXXXXXXXXXXXXXXXXX&amp;amp;amp;spr=https&amp;amp;amp;sks=b&amp;amp;amp;skt=2026-04-16T07:00:09Z&amp;amp;amp;sp=racwdxlm&amp;amp;amp;skv=2025-01-05&amp;amp;amp;sr=d, AuthorizationFailure, , "This request is not authorized to perform this operation. RequestId:1b89a867-b01f-0056-1b71-cdf6f8000000 Time:2026-04-16T07:17:26.4365052Z"&lt;/LI-CODE&gt;&lt;P&gt;I'm running the query on our own SQL Warehouse, not a serverless SQL warehouse.&lt;/P&gt;&lt;P&gt;I have verified the following:&lt;/P&gt;&lt;OL&gt;&lt;LI&gt;I have permissions to the catalog and schema&lt;/LI&gt;&lt;LI&gt;Browsing the external location works&lt;/LI&gt;&lt;LI&gt;The access connector that the storage credential is mapped to has the Storage Blob Data Contributor role on the storage account&lt;/LI&gt;&lt;/OL&gt;&lt;P&gt;My suspicion is that the culprit here is that&amp;nbsp;&lt;A href="https://learn.microsoft.com/fi-fi/azure/databricks/ldp/dbsql/materialized#mv-requirements" target="_self"&gt;materialized views are backed by a serverless pipeline&lt;/A&gt;, even though I'm not using serverless compute to run my notebook. Could this be the issue here? If so, how do I fix this?&lt;/P&gt;</description>
      <pubDate>Thu, 16 Apr 2026 07:33:12 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/materialized-view-creation-fails/m-p/154714#M54126</guid>
      <dc:creator>PNC</dc:creator>
      <dc:date>2026-04-16T07:33:12Z</dc:date>
    </item>
    <item>
      <title>Re: Materialized view creation fails</title>
      <link>https://community.databricks.com/t5/data-engineering/materialized-view-creation-fails/m-p/154723#M54127</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/141992"&gt;@PNC&lt;/a&gt;,&lt;/P&gt;
&lt;P&gt;I don't think it has to do with the serverless compute to run the notebook. I'm just wondering if it's related to your access to the underlying storage.&lt;/P&gt;
&lt;P&gt;Can you try the following?&lt;/P&gt;
&lt;P&gt;In Catalog Explorer, open catalog → schema → check the Storage / managed location section and note which storage credential is attached.&lt;/P&gt;
&lt;P&gt;In the Databricks account console, open that storage credential and note which access connector / managed identity / service principal it uses.&lt;/P&gt;
&lt;P&gt;In the Azure Portal for storageaccount:&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;Under Access control (IAM), confirm that this exact identity has Storage Blob Data Contributor (or Owner) scoped to the storage account or at least the container that holds catalog/schema/__unitystorage/....&lt;/LI&gt;
&lt;LI&gt;If you only granted Blob Data Contributor to an access connector used for a different external location, that won’t help this MV backing location.&lt;/LI&gt;
&lt;/UL&gt;
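&lt;P&gt;One extra check worth doing: the failing request URL in the error message already tells you which path was denied (the directory parameter) and which identity the user-delegation SAS should correspond to (the skoid parameter). Below is a standard-library Python sketch, not from the thread, with anonymized placeholder values; in practice you would run parse_qs(urlparse(failed_url).query) on the URL copied from your own error.&lt;/P&gt;

```python
# Sketch: read the "directory" and "skoid" query parameters from the 403 URL.
# The values below are anonymized placeholders modeled on the error in this
# thread; on a real URL, use parse_qs(urlparse(failed_url).query) instead.
from urllib.parse import parse_qs, urlencode

# Stand-in for the query string of the failing request (urlencode round-trips
# cleanly through parse_qs, so this mimics parsing the real URL).
query = urlencode({
    "resource": "filesystem",
    "directory": "catalog/schema/__unitystorage/schemas/0000/tables/0000/_delta_log",
    "skoid": "00000000-0000-0000-0000-000000000000",
})

params = parse_qs(query)
denied_path = params["directory"][0]  # the exact path the request was denied on
signing_oid = params["skoid"][0]      # object ID the user-delegation SAS was issued for

# The denied path sits under __unitystorage, i.e. UC-managed storage,
# not the external-location path that was already verified.
print(denied_path)
print(signing_oid)
```

&lt;P&gt;If that skoid is not the object ID of the managed identity behind the access connector you granted, that is a strong hint a different credential is being used for this path.&lt;/P&gt;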
&lt;P&gt;Also, can you confirm you can read the base table from the same cluster/warehouse?&lt;/P&gt;
&lt;P&gt;Just run something like this:&lt;/P&gt;
&lt;DIV class="l8rrz21 _1ibi0s3do" data-ui-element="code-block-container"&gt;
&lt;PRE&gt;&lt;CODE class="markdown-code-sql p8i6j0e hljs language-sql _12n1b832"&gt;&lt;SPAN class="hljs-keyword"&gt;SELECT&lt;/SPAN&gt; &lt;SPAN class="hljs-built_in"&gt;COUNT&lt;/SPAN&gt;(&lt;SPAN class="hljs-operator"&gt;*&lt;/SPAN&gt;) &lt;SPAN class="hljs-keyword"&gt;FROM&lt;/SPAN&gt; catalog.schema.table;&lt;/CODE&gt;&lt;/PRE&gt;
&lt;DIV class="l8rrz23 _1ibi0s3d7 _1ibi0s332 _1ibi0s3dp _1ibi0s3bm _1ibi0s3ce"&gt;
&lt;DIV class="lqznwq0"&gt;&lt;SPAN&gt;If this fails with a UC permission error, fix catalog/schema/table grants first.&lt;/SPAN&gt;&lt;/DIV&gt;
&lt;/DIV&gt;
&lt;/DIV&gt;
&lt;P class="p8i6j01 paragraph"&gt;You may also want to check and ensure compute is UC-compatible (shared cluster or SQL warehouse. Not legacy/no-isolation single-user only). If other UC tables in this catalog work from this compute, you’re probably fine.&lt;/P&gt;
&lt;P class="p1"&gt;&lt;FONT size="2" color="#FF6600"&gt;&lt;STRONG&gt;&lt;I&gt;If this answer resolves your question, could you mark it as “Accept as Solution”? That helps other users quickly find the correct fix.&lt;/I&gt;&lt;/STRONG&gt;&lt;/FONT&gt;&lt;I&gt;&lt;/I&gt;&lt;/P&gt;
</description>
      <pubDate>Thu, 16 Apr 2026 10:03:13 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/materialized-view-creation-fails/m-p/154723#M54127</guid>
      <dc:creator>Ashwin_DSA</dc:creator>
      <dc:date>2026-04-16T10:03:13Z</dc:date>
    </item>
    <item>
      <title>Re: Materialized view creation fails</title>
      <link>https://community.databricks.com/t5/data-engineering/materialized-view-creation-fails/m-p/154730#M54129</link>
      <description>&lt;DIV&gt;&lt;SPAN&gt;The schema's storage location is something like this:&lt;/SPAN&gt;&lt;/DIV&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;DIV&gt;&lt;SPAN&gt;abfss://my-container@my-storage-account.dfs.core.windows.net/catalog/schema/__unitystorage/schemas/xxx-xxx-xxx-xxx-xxx&lt;/SPAN&gt;&lt;/DIV&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;DIV&gt;&lt;SPAN&gt;I have an external location called "container_catalog" for the URL&amp;nbsp;abfss://my-container@my-storage-account.dfs.core.windows.net/catalog&lt;/SPAN&gt;&lt;/DIV&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;DIV&gt;&lt;SPAN&gt;The storage credential for this location is called "my_credential" and its connector ID is&amp;nbsp;/subscriptions/xxx-xxx-xxx-xxxx-xxx/resourceGroups/my-resource-group/providers/Microsoft.Databricks/accessConnectors/my-access-connector&lt;/SPAN&gt;&lt;/DIV&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;DIV&gt;&lt;SPAN&gt;Now when I go to the Azure portal, navigate to storage account "my-storage-account", and open up IAM, I can see that my-access-connector has the&amp;nbsp;Storage Blob Data Contributor role assigned to it (scoped to the storage account).&lt;/SPAN&gt;&lt;/DIV&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;DIV&gt;&lt;SPAN&gt;When I run&amp;nbsp;&lt;/SPAN&gt;&lt;/DIV&gt;&lt;LI-CODE lang="markup"&gt;SELECT COUNT(*) FROM catalog.schema.table;&lt;/LI-CODE&gt;&lt;DIV&gt;&lt;SPAN&gt;I get the row count as expected.&lt;/SPAN&gt;&lt;/DIV&gt;</description>
      <pubDate>Thu, 16 Apr 2026 12:28:30 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/materialized-view-creation-fails/m-p/154730#M54129</guid>
      <dc:creator>PNC</dc:creator>
      <dc:date>2026-04-16T12:28:30Z</dc:date>
    </item>
    <item>
      <title>Re: Materialized view creation fails</title>
      <link>https://community.databricks.com/t5/data-engineering/materialized-view-creation-fails/m-p/154736#M54130</link>
      <description>&lt;P&gt;There are multiple requirements for materialized views. You can check them below:&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;&lt;P&gt;You must use a&amp;nbsp;Unity Catalog-enabled &lt;STRONG&gt;pro&lt;/STRONG&gt; or serverless SQL warehouse.&lt;/P&gt;&lt;/LI&gt;&lt;LI&gt;To incrementally refresh a&amp;nbsp;materialized view&amp;nbsp;from Delta tables, the source tables must have&amp;nbsp;&lt;A href="https://docs.databricks.com/aws/en/ldp/dbsql/materialized#mv-row-tracking" target="_blank"&gt;row tracking enabled&lt;/A&gt;.&lt;/LI&gt;&lt;LI&gt;&lt;P&gt;The owner (the user who creates the&amp;nbsp;materialized view) must have the following permissions:&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;SELECT&amp;nbsp;privilege on the base tables referenced by the&amp;nbsp;materialized view.&lt;/LI&gt;&lt;LI&gt;USE CATALOG&amp;nbsp;and&amp;nbsp;USE SCHEMA&amp;nbsp;privileges on the catalog and schema containing the source tables for the&amp;nbsp;materialized view.&lt;/LI&gt;&lt;LI&gt;USE CATALOG&amp;nbsp;and&amp;nbsp;USE SCHEMA&amp;nbsp;privileges on the target catalog and schema for the&amp;nbsp;materialized view.&lt;/LI&gt;&lt;LI&gt;CREATE TABLE&amp;nbsp;and&amp;nbsp;CREATE MATERIALIZED VIEW&amp;nbsp;privileges on the schema containing the&amp;nbsp;materialized view.&lt;/LI&gt;&lt;/UL&gt;&lt;/LI&gt;&lt;LI&gt;Your workspace must be in a region that supports&amp;nbsp;&lt;A href="https://docs.databricks.com/aws/en/resources/feature-region-support#serverless-aws" target="_blank"&gt;serverless SQL warehouses&lt;/A&gt;.&lt;/LI&gt;&lt;/UL&gt;</description>
      <pubDate>Thu, 16 Apr 2026 12:59:27 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/materialized-view-creation-fails/m-p/154736#M54130</guid>
      <dc:creator>balajij8</dc:creator>
      <dc:date>2026-04-16T12:59:27Z</dc:date>
    </item>
    <item>
      <title>Re: Materialized view creation fails</title>
      <link>https://community.databricks.com/t5/data-engineering/materialized-view-creation-fails/m-p/154743#M54131</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/141992"&gt;@PNC&lt;/a&gt;,&lt;/P&gt;
&lt;P&gt;Thanks for checking...&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I think your setup is very close. The missing piece is which identity is actually used for the MV backing storage, which is not necessarily the same as the one behind your external location.&lt;/P&gt;
&lt;P&gt;Because you’re already seeing a 403 from ADLS for the __unitystorage path, the serverless MV pipeline is actually starting, which is good. The failure is now purely an Azure Storage authorisation problem, not a serverless problem.&lt;/P&gt;
&lt;P&gt;SELECT COUNT(*) FROM catalog.schema.table reads from your external location abfss://my-container@my-storage-account.dfs.core.windows.net/catalog using storage credential my_credential (backed by my-access-connector), which has Blob Data Contributor. So, it works.&lt;/P&gt;
&lt;P&gt;Your create MV query writes MV data under the schema’s managed location (.../catalog/schema/__unitystorage/schemas/...).&lt;/P&gt;
&lt;P&gt;The serverless MV pipeline uses the identity associated with the catalog/schema’s managed storage / metastore default storage, which may be different from my_credential.&lt;/P&gt;
&lt;P&gt;So you’ve granted rights to my-access-connector (for the external location), but the identity actually used for __unitystorage/... is likely another access connector or managed identity that currently does not have rights on my-storage-account, hence the 403.&lt;/P&gt;
&lt;P&gt;Can you find out which credential is used for the managed location?&lt;/P&gt;
&lt;P&gt;You can do this by running the queries below:&lt;/P&gt;
&lt;LI-CODE lang="markup"&gt;DESCRIBE CATALOG EXTENDED catalog;
DESCRIBE SCHEMA EXTENDED catalog.schema;&lt;/LI-CODE&gt;
&lt;P&gt;Or, in Catalog Explorer, open the catalog&amp;nbsp;--&amp;gt; Storage / Managed storage, then go to the schema and check its Managed location and Storage credential (if shown).&lt;/P&gt;
&lt;P&gt;You’re looking for the storage credential name (and thus the access connector / identity) that backs the managed location where __unitystorage/schemas/... lives. It may not be my_credential.&lt;/P&gt;
&lt;P&gt;In the Databricks account console, open the storage credential you found in above step and note its access connector / managed identity.&lt;/P&gt;
&lt;P&gt;In the Azure Portal, go to storage account my-storage-account&amp;nbsp;--&amp;gt; Access control (IAM) and add a role assignment as follows:&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;Role: Storage Blob Data Contributor (or Owner)&lt;/LI&gt;
&lt;LI&gt;Scope: the storage account (or at least my-container)&lt;/LI&gt;
&lt;LI&gt;Principal: the identity from that storage credential (not just my-access-connector if they differ).&lt;/LI&gt;
&lt;/UL&gt;
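&lt;P&gt;As a small aid for the IAM step, here is a hypothetical standard-library Python helper (not a Databricks API; the example URL is the one quoted earlier in this thread) that splits an abfss:// managed-location URL into the storage account and container to open under Access control (IAM):&lt;/P&gt;

```python
# Sketch: given an abfss:// URL of the form
#   abfss://container(at)account.dfs.core.windows.net/path,
# return the (account, container) pair to check under IAM.
from urllib.parse import urlparse

def abfss_account_and_container(url: str) -> tuple[str, str]:
    parsed = urlparse(url)
    container, _, host = parsed.netloc.partition("@")
    account = host.split(".")[0]  # "my-storage-account.dfs.core.windows.net" -> "my-storage-account"
    return account, container

# Example value quoted earlier in this thread.
managed_location = (
    "abfss://my-container@my-storage-account.dfs.core.windows.net"
    "/catalog/schema/__unitystorage/schemas/xxx-xxx-xxx-xxx-xxx"
)
print(abfss_account_and_container(managed_location))  # ('my-storage-account', 'my-container')
```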
&lt;P&gt;After correcting the storage permissions for the actual managed-location identity, rerun the statement below to see if it works:&lt;/P&gt;
&lt;LI-CODE lang="markup"&gt;CREATE OR REPLACE MATERIALIZED VIEW catalog.schema.mView_test
AS SELECT * FROM catalog.schema.table LIMIT 10;&lt;/LI-CODE&gt;
&lt;P&gt;If a DESCRIBE ... EXTENDED shows that the managed location is using some internal/default credential (not visible as my_credential), you’ll need to give that identity Blob Data Contributor on my-storage-account as well, or move the schema’s managed storage to a credential you control.&lt;/P&gt;
&lt;P&gt;&lt;FONT size="2" color="#FF6600"&gt;&lt;STRONG&gt;&lt;I&gt;If this answer resolves your question, could you mark it as “Accept as Solution”? That helps other users quickly find the correct fix.&lt;/I&gt;&lt;/STRONG&gt;&lt;/FONT&gt;&lt;/P&gt;</description>
      <pubDate>Thu, 16 Apr 2026 14:23:25 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/materialized-view-creation-fails/m-p/154743#M54131</guid>
      <dc:creator>Ashwin_DSA</dc:creator>
      <dc:date>2026-04-16T14:23:25Z</dc:date>
    </item>
  </channel>
</rss>

