Hi @PNC,
I don't think this is related to the serverless compute used to run the notebook. I suspect it's an issue with your access to the underlying storage.
Can you try the steps below?
In Catalog Explorer, open catalog → schema → check the Storage / managed location section and note which storage credential is attached.
In Catalog Explorer (External Data → Credentials), open that storage credential and note which access connector / managed identity / service principal it uses.
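If it's easier, you can pull the same information with SQL. A quick sketch, assuming placeholder names (swap in your actual catalog, schema, and credential names):

```sql
-- Managed location for the schema (the storage path the MV data lands in)
DESCRIBE SCHEMA EXTENDED my_catalog.my_schema;

-- Details of the storage credential, including the Azure managed identity
-- / access connector it wraps (credential name here is a placeholder)
DESCRIBE STORAGE CREDENTIAL my_storage_credential;
```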
In the Azure Portal, for the storage account:
- Under Access control (IAM), confirm that this exact identity has Storage Blob Data Contributor (or Storage Blob Data Owner) scoped to the storage account, or at least to the container that holds catalog/schema/__unitystorage/....
- If you only granted Storage Blob Data Contributor to an access connector used for a different external location, that won't cover this materialized view's backing location.
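If you'd rather verify the role assignment from the CLI than click through the portal, something like this should do it (all IDs below are placeholders for your subscription, resource group, storage account, and the identity's object ID):

```shell
# List role assignments for the identity at storage-account scope
az role assignment list \
  --assignee <managed-identity-object-id> \
  --scope "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<storage-account>" \
  --output table
# You want "Storage Blob Data Contributor" (or Owner) in the Role column.
```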
Also, can you confirm you can read the base table from the same cluster/warehouse?
Just run something like:
SELECT COUNT(*) FROM catalog.schema.table;
If this fails with a UC permission error, fix catalog/schema/table grants first.
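If it does turn out to be a grants problem, a minimal fix looks like this (the principal and object names are placeholders, not your actual ones):

```sql
-- Inspect current grants on the base table
SHOW GRANTS ON TABLE my_catalog.my_schema.my_table;

-- Minimal set of privileges needed to read the table
GRANT USE CATALOG ON CATALOG my_catalog TO `user@example.com`;
GRANT USE SCHEMA ON SCHEMA my_catalog.my_schema TO `user@example.com`;
GRANT SELECT ON TABLE my_catalog.my_schema.my_table TO `user@example.com`;
```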
You may also want to confirm the compute is UC-compatible (a shared-access-mode cluster or a SQL warehouse, not a legacy no-isolation cluster). If other UC tables in this catalog work from the same compute, you're probably fine.
If this answer resolves your question, could you mark it as “Accept as Solution”? That helps other users quickly find the correct fix.
Regards,
Ashwin | Delivery Solution Architect @ Databricks
Helping you build and scale the Data Intelligence Platform.
***Opinions are my own***