Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Urgent Assistance Needed – Unity Catalog Storage Access Failure & VM SKU Availability (Databricks on Azure)

suchitpathak08
Visitor

Hi everyone,

I’m running into two blocking issues while trying to run a Delta Live Tables (DLT) pipeline on Azure Databricks. I’m hoping someone can help me understand what’s going wrong.

1. Unity Catalog cannot access underlying ADLS storage

Every DLT pipeline run fails with:

    UNITY_CATALOG_INITIALIZATION_FAILED
    INVALID_STATE.UC_CLOUD_STORAGE_ACCESS_FAILURE
    AbfsRestOperationException

Even though:

  • The Access Connector’s managed identity has:
    • Storage Blob Data Owner
    • Storage Blob Data Contributor
  • My user account also has both of these roles.
  • ACLs on all containers (bronze, silver, gold, source, metastore, logs) grant Read / Write / Execute to both:
    • My user account
    • The Access Connector managed identity

All containers show correct ACLs (rwx), and IAM roles look correct at the storage account level.
But the pipeline still cannot initialize UC or access the storage.
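
If it helps anyone reproduce this, the role assignments can be double-checked from the Azure CLI along these lines. The resource names below are placeholders for my actual resources, and az databricks access-connector needs the "databricks" CLI extension:

    # Get the Access Connector's managed identity principal ID
    az databricks access-connector show \
        --resource-group my-rg \
        --name my-access-connector \
        --query identity.principalId -o tsv

    # List role assignments for that identity at the storage-account scope
    az role assignment list \
        --assignee <principal-id-from-above> \
        --scope "/subscriptions/<sub-id>/resourceGroups/my-rg/providers/Microsoft.Storage/storageAccounts/mystorageacct" \
        --include-inherited -o table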

2. VM size / SKU not available for DLT job compute

When the DLT pipeline tries to start a job cluster, I get:

    The VM size you are specifying is not available (SkuNotAvailable)
    QuotaExceeded: Required cores exceed available limit

Even small SKUs fail:

  • Standard_F4
  • Standard_DS3_v2
  • Standard_DS2_v2 (not visible in UI)
  • Standard_F2 (not visible in UI)

Azure CLI shows that many F-series SKUs exist in UK South, but in Databricks they fail to provision or don’t appear in the dropdown.

This makes it impossible to run even a minimal DLT cluster with 1 worker.
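
For what it’s worth, az vm list-skus has a Restrictions column that a plain SKU listing can hide: a SKU can exist in the region but still be restricted for a given subscription, which would match the SkuNotAvailable error. A check along these lines (region matches mine):

    # List F-series SKUs in UK South, including restricted ones
    az vm list-skus \
        --location uksouth \
        --size Standard_F \
        --all -o table
    # The "Restrictions" column is the important part: a SKU that exists in
    # the region may still show "NotAvailableForSubscription" here.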

3. Additional symptoms
  • The pipeline UI sometimes hides the “Advanced Options” section for compute configuration (no worker/driver selector).
  • Creating a manual cluster shows warnings like the following (see the quota check after this list):
    • “This account may not have enough CPU cores to satisfy this request”
    • “Estimated available: 2, requested: 8”
  • Even when using a 4-core node with only one worker, the VM still fails with SkuNotAvailable.
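
Given the “Estimated available: 2, requested: 8” warning, a regional quota check along these lines should show whether the subscription’s vCPU limits are the cause (region placeholder matches mine):

    # Per-family vCPU usage vs. quota in the region
    az vm list-usage --location uksouth -o table
    # Relevant rows: "Total Regional vCPUs" plus the family rows, e.g.
    # "Standard F Family vCPUs" and "Standard DSv2 Family vCPUs".
    # If Limit minus CurrentValue is less than the cores the cluster needs
    # (driver + workers), that explains the QuotaExceeded error.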

What I’m trying to understand

  1. Why UC still fails to access ADLS even when IAM + ACLs appear fully correct.
  2. Whether this is a region-wide VM capacity issue in UK South.
  3. Whether Databricks can enable smaller SKUs (F2 / DS2_v2) so DLT can run.
  4. Whether this is a misconfiguration in my workspace or an Azure capacity limitation.
  5. If the fix is:
    • an ACL / Data Lake permissions change,
    • a quota request,
    • a VM selection change,
    • or migrating to another Azure region.

Any guidance would be hugely appreciated.

I’ve already:

  • Checked IAM
  • Checked container ACLs (see the CLI check below)
  • Regenerated ACLs via Azure Portal
  • Validated the Access Connector identity
  • Tried multiple VM SKUs
  • Deleted and recreated the pipeline
  • Verified catalog and schema exist in Unity Catalog

Still getting the same two errors.
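
For completeness, the ACL re-check can be reproduced from the Azure CLI roughly like this (account and container names are placeholders; --auth-mode login authenticates as my Azure AD user):

    # Show owner, group, and ACL entries on a container's root directory
    az storage fs access show \
        --account-name mystorageacct \
        --file-system bronze \
        --path / \
        --auth-mode login
    # The returned ACL string should contain rwx entries for both identities:
    #   user:<my-user-object-id>:rwx
    #   user:<access-connector-principal-id>:rwx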

Thanks in advance to anyone who can help!
