Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Materialized view creation fails

PNC
Databricks Partner

Hi,

I have run into a problem when creating a materialized view.

Here's my simple query I'm trying to run:

%sql
create or replace materialized view catalog.schema.mView_test
as select * from catalog.schema.table limit 10;

I'm getting the following error:

Encountered an error with Unity Catalog while setting up the pipeline on cluster xxxx-xxxxxx-xxxxxxxx-xxx. 
Ensure that your Unity Catalog configuration is correct, and that required resources (e.g., catalog, schema) exist and are accessible. 
Also verify that the cluster has appropriate permissions to access Unity Catalog.

Details: Operation failed: "This request is not authorized to perform this operation.", 403, GET, https://storageaccount.dfs.core.windows.net/container?upn=false&beginFrom=0000000000000000000&resource=filesystem&maxResults=5000&directory=catalog/schema/__unitystorage/schemas/9e94de07-1f9d-4798-b250-d34f6f2b769d/tables/b34a78c7-fbcc-4265-a331-da4372e59afc/_delta_log&timeout=90&recursive=false&st=2026-04-16T07:00:09Z&sv=2020-02-10&ske=2026-04-16T09:00:09Z&sig=XXXXX&sktid=2fb08174-a150-479d-8d15-2174da71a11a&se=2026-04-16T08:17:22Z&sdd=7&skoid=1456c2e6-8869-41a4XXXXXXXXXXXXXXXXXX&spr=https&sks=b&skt=2026-04-16T07:00:09Z&sp=racwdxlm&skv=2025-01-05&sr=d, AuthorizationFailure, , "This request is not authorized to perform this operation. RequestId:1b89a867-b01f-0056-1b71-cdf6f8000000 Time:2026-04-16T07:17:26.4365052Z"
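The 403 response itself names the exact storage path the request targeted, which is useful when matching it against your external locations. Here is a minimal Python sketch for pulling that apart (the `describe_failed_request` helper is illustrative, and the URL below is an abbreviated stand-in for the one in the error, not the real request):

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical helper: extract the storage account, container, and directory
# from the ADLS Gen2 request URL quoted in a 403 error like the one above.
def describe_failed_request(url: str) -> dict:
    parsed = urlparse(url)
    params = parse_qs(parsed.query)
    return {
        "account": parsed.netloc.split(".")[0],         # e.g. "storageaccount"
        "container": parsed.path.lstrip("/"),           # e.g. "container"
        "directory": params.get("directory", [""])[0],  # path UC tried to list
    }

# Abbreviated stand-in for the URL quoted in the error message
url = ("https://storageaccount.dfs.core.windows.net/container"
       "?resource=filesystem&directory=catalog/schema/__unitystorage"
       "/schemas/xxx/tables/yyy/_delta_log&recursive=false")
print(describe_failed_request(url))
```

The `directory` value is the path Unity Catalog was trying to list on your behalf; whatever identity made the request needs read access to exactly that path.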

I'm running the query on our own SQL warehouse, not a serverless SQL warehouse.

I have made sure of the following:

  1. I have permissions to the catalog and schema
  2. Browsing the external location works
  3. The access connector that the storage credential is mapped to has the Storage Blob Data Contributor role on the storage account

My suspicion is that the culprit here is that materialized views are backed by a serverless pipeline, even though I'm not using serverless compute to run my notebook. Could this be the issue? If so, how do I fix it?


Ashwin_DSA
Databricks Employee

Hi @PNC,

I don't think it has to do with the serverless compute running the notebook. I'm wondering whether it's related to your access to the underlying storage.

Can you try the checks below?

In Catalog Explorer, open catalog → schema → check the Storage / managed location section and note which storage credential is attached.

In the Databricks account console, open that storage credential and note which access connector / managed identity / service principal it uses.

In the Azure Portal for storageaccount:

  • Under Access control (IAM), confirm that this exact identity has Storage Blob Data Contributor (or Owner) scoped to the storage account or at least the container that holds catalog/schema/__unitystorage/....
  • If you only granted Storage Blob Data Contributor to an access connector used for a different external location, that won't help with this MV's backing location.
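One sanity check that follows from the bullets above: the path from the 403 must sit under an external location (or managed storage root) whose credential actually has the role. Since ADLS paths are hierarchical, this reduces to a path-boundary-aware prefix match. A minimal Python sketch (the URLs are placeholders in the same shape as this thread's, not real ones):

```python
# Minimal sketch: does a storage path fall under an external location's URL?
# A plain startswith() would wrongly match sibling paths like ".../catalog2",
# so we require either equality or a "/" right after the location prefix.
def covered_by(path_url: str, location_url: str) -> bool:
    loc = location_url.rstrip("/")
    return path_url == loc or path_url.startswith(loc + "/")

ext_loc = "abfss://my-container@my-storage-account.dfs.core.windows.net/catalog"
mv_path = ext_loc + "/schema/__unitystorage/schemas/xxx/tables/yyy"
sibling = "abfss://my-container@my-storage-account.dfs.core.windows.net/catalog2"

print(covered_by(mv_path, ext_loc))  # True: the MV path is under the location
print(covered_by(sibling, ext_loc))  # False: "catalog2" is a sibling, not a child
```

If the failing path is not covered by any location whose credential has the role, that is the gap to close.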

Also, can you confirm you can read the base table from the same cluster/warehouse?

Just run something like the below.

SELECT COUNT(*) FROM catalog.schema.table;

If this fails with a UC permission error, fix the catalog/schema/table grants first.

You may also want to check that the compute is UC-compatible (a shared cluster or SQL warehouse, not a legacy no-isolation or single-user-only cluster). If other UC tables in this catalog work from this compute, you're probably fine.

If this answer resolves your question, could you mark it as “Accept as Solution”? That helps other users quickly find the correct fix.

Regards,
Ashwin | Delivery Solution Architect @ Databricks
Helping you build and scale the Data Intelligence Platform.
***Opinions are my own***

PNC
Databricks Partner

The schema's storage location is something like this:

abfss://my-container@my-storage-account.dfs.core.windows.net/catalog/schema/__unitystorage/schemas/xxx-xxx-xxx-xxx-xxx

I have an external location called "container_catalog" for the URL abfss://my-container@my-storage-account.dfs.core.windows.net/catalog

The storage credential for this location is called "my_credential" and its connector ID is /subscriptions/xxx-xxx-xxx-xxxx-xxx/resourceGroups/my-resource-group/providers/Microsoft.Databricks/accessConnectors/my-access-connector

Now when I go to the Azure portal, navigate to the storage account "my-storage-account", and open IAM, I can see that my-access-connector has the Storage Blob Data Contributor role assigned to it (scoped to the storage account).

When I run

SELECT COUNT(*) FROM catalog.schema.table;

I get the row count as expected.