Hi @AnandGNR
The error message "Fetch request failed due expired user session" is a bit misleading here — it's not actually about your browser session. It's a known quirk of the Databricks secret scope creation UI/API when it can't successfully reach or validate the Azure Key Vault endpoint during the creation flow. Here's a structured breakdown of what to check:
1. The Real Culprit: Network Connectivity from Databricks to AKV
Your Key Vault has "Public access from selected networks" enabled with a Private Endpoint. This is almost certainly the root cause.
When Databricks tries to validate the AKV URI during secret scope creation, that call originates from the Databricks control plane, not your browser or your workspace VNet. If the AKV firewall doesn't allow the Databricks control plane IPs, the handshake fails — and the UI surfaces this confusingly as a "session" error.
Fix options:
Temporarily set AKV to "Allow public access from all networks" while creating the scope, then lock it back down. This is the fastest way to confirm if networking is the issue.
Add Databricks control plane IPs to the AKV firewall allowlist. For Azure Databricks on Azure, the control plane egress IPs vary by region. Find yours here: Azure Databricks IP addresses
Check whether "Allow trusted Microsoft services to bypass this firewall" is enabled in your AKV Networking tab — but don't rely on it alone: Azure Databricks is not guaranteed to be on Key Vault's trusted-services list, so verify against the current Azure documentation for your region.
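For reference, here's a rough az CLI sketch of the firewall changes described above. `<vault-name>`, `<rg>`, and `<databricks-control-plane-ip>` are placeholders you'd substitute for your environment:

```shell
# Diagnostic: temporarily allow public access from all networks,
# then retry the scope creation. If it succeeds, networking was the issue.
az keyvault update --name <vault-name> --resource-group <rg> \
  --public-network-access Enabled --default-action Allow

# Afterwards: lock back down and allowlist the control plane IPs instead
az keyvault update --name <vault-name> --resource-group <rg> \
  --default-action Deny
az keyvault network-rule add --name <vault-name> --resource-group <rg> \
  --ip-address <databricks-control-plane-ip>/32
```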
2. Access Policy vs. RBAC Mode Mismatch
You mentioned setting Get and List on the AzureDatabricks service principal via Access Policies. Make sure your AKV is actually in Vault Access Policy mode, not Azure RBAC mode — these are mutually exclusive.
Go to AKV → Access configuration → check the Permission model
If it's set to Azure role-based access control, your Access Policy grants are being silently ignored. In that case, assign the Key Vault Secrets User role to the AzureDatabricks SP via IAM instead
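A quick way to check which permission model the vault is actually using, and to grant the RBAC role if needed (same placeholders as above; the GUID is the well-known AzureDatabricks application ID mentioned in step 4):

```shell
# true  -> Azure RBAC mode (Access Policies are silently ignored)
# false -> Vault Access Policy mode (your Get/List grants apply)
az keyvault show --name <vault-name> \
  --query "properties.enableRbacAuthorization"

# If in RBAC mode, grant the AzureDatabricks SP read access to secrets:
az role assignment create \
  --assignee 2ff814a6-3304-4ab8-85cb-cd0e6f879c1d \
  --role "Key Vault Secrets User" \
  --scope $(az keyvault show --name <vault-name> --query id -o tsv)
```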
3. Use the CLI Instead of the UI
The #secrets/createScope UI page in Databricks is notoriously fragile for AKV-backed scopes. Use the REST API or Databricks CLI directly — it gives much better error messages:
```shell
databricks secrets create-scope \
  --scope <scope-name> \
  --scope-backend-type AZURE_KEYVAULT \
  --resource-id /subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.KeyVault/vaults/<vault-name> \
  --dns-name https://<vault-name>.vault.azure.net/
```
If this fails, the error returned will be far more specific than the UI's generic session message.
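If the CLI isn't configured, the equivalent REST call can be sketched with curl. This assumes `$DATABRICKS_HOST` holds your workspace URL and `$DATABRICKS_TOKEN` holds an Azure AD token (note that AKV-backed scope creation requires an AAD token, not a personal access token):

```shell
curl -X POST "$DATABRICKS_HOST/api/2.0/secrets/scopes/create" \
  -H "Authorization: Bearer $DATABRICKS_TOKEN" \
  -d '{
    "scope": "<scope-name>",
    "scope_backend_type": "AZURE_KEYVAULT",
    "backend_azure_keyvault": {
      "resource_id": "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.KeyVault/vaults/<vault-name>",
      "dns_name": "https://<vault-name>.vault.azure.net/"
    }
  }'
```

The JSON error body returned here will name the actual failure (permissions, networking, bad resource ID) rather than the UI's generic session message.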
4. Verify the AzureDatabricks SP is Registered in Your Tenant
The AzureDatabricks enterprise application must exist in your AAD tenant (it's auto-created the first time someone logs into Databricks in the tenant, but occasionally it's missing).
```shell
az ad sp list --display-name "AzureDatabricks" --query "[].{AppId:appId, Id:id}"
```
If this returns empty, you need to provision it:
```shell
az ad sp create --id 2ff814a6-3304-4ab8-85cb-cd0e6f879c1d
```
That GUID is the well-known App ID for the AzureDatabricks first-party application.
Recommended Sequence
1. Confirm the AzureDatabricks SP exists in your tenant (step 4)
2. Confirm the AKV permission model matches how you granted access (step 2)
3. Temporarily open the AKV firewall to all networks and retry via CLI (steps 1 + 3)
4. If it works, re-lock the firewall and add the correct control plane IPs to the allowlist
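Once the scope creates successfully, a quick sanity check (assuming the same CLI as above):

```shell
# Confirm the scope exists and shows AZURE_KEYVAULT as its backend
databricks secrets list-scopes

# List secret names visible through the scope (values are never displayed)
databricks secrets list --scope <scope-name>
```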
The combination of the Private Endpoint + selected-network firewall is the most common cause of this exact symptom. The CLI approach will also get you out of the UI's session-validation quirks entirely.
LR