Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Unable to create secret scope - "Fetch request failed due expired user session"

AnandGNR
New Contributor II

Hi everyone,

I’m trying to create an Azure Key Vault-backed secret scope in a Databricks Premium workspace, but I keep getting this error:

 

Fetch request failed due expired user session

Setup details:

  • Databricks workspace: Premium
  • Azure Key Vault: Owner role on my account
  • Databricks roles: workspace and account admin
  • AzureDatabricks SP: Get and List permissions granted via Access Policies
  • Key Vault networking: firewall allows public access from selected networks; a Private Endpoint is also configured

What I’ve tried: 

Still no luck.

Has anyone faced this issue? Any workarounds/pointers would be appreciated!

Thanks


lingareddy_Alva
Esteemed Contributor

Hi @AnandGNR 

The error message "Fetch request failed due expired user session" is a bit misleading here — it's not actually about your browser session. It's a known quirk of the Databricks secret scope creation UI/API when it can't successfully reach or validate the Azure Key Vault endpoint during the creation flow. Here's a structured breakdown of what to check:

1. The Real Culprit: Network Connectivity from Databricks to AKV
Your Key Vault has "Public access from selected networks" enabled with a Private Endpoint. This is almost certainly the root cause.
When Databricks tries to validate the AKV URI during secret scope creation, that call originates from the Databricks control plane, not your browser or your workspace VNet. If the AKV firewall doesn't allow the Databricks control plane IPs, the handshake fails — and the UI surfaces this confusingly as a "session" error.
Fix options:

  • Temporarily set AKV to "Allow public access from all networks" while creating the scope, then lock it back down. This is the fastest way to confirm that networking is the issue.
  • Add the Databricks control plane IPs to the AKV firewall allowlist. For Azure Databricks, the control plane egress IPs vary by region; find yours here: Azure Databricks IP addresses
  • Ensure "Allow trusted Microsoft services to bypass this firewall" is checked in your AKV networking tab; Databricks, as an Azure-native service, can leverage this.
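If you'd rather script the firewall change than click through the portal, the bypass can be flipped with the Azure CLI. A minimal sketch, assuming `az login` has already been run; the vault and resource group names are hypothetical placeholders, and the command is echoed here so you can review it before running:

```shell
# Hypothetical placeholders -- substitute your own vault and resource group.
VAULT="my-keyvault"
RG="my-resource-group"

# --bypass AzureServices corresponds to the portal checkbox
# "Allow trusted Microsoft services to bypass this firewall";
# --default-action Deny keeps the selected-networks restriction
# in place for everything else.
CMD="az keyvault update --name $VAULT --resource-group $RG --bypass AzureServices --default-action Deny"

# Echoed for review; run the printed command (or drop the echo) after `az login`.
echo "$CMD"
```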

2. Access Policy vs. RBAC Mode Mismatch

You mentioned setting Get and List on the AzureDatabricks service principal via Access Policies. Make sure your AKV is actually in Vault Access Policy mode, not Azure RBAC mode — these are mutually exclusive.

  • Go to AKV → Access configuration → check the Permission model.
  • If it's set to Azure role-based access control, your Access Policy grants are being silently ignored. In that case, assign the Key Vault Secrets User role to the AzureDatabricks SP via IAM instead.
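Both checks can be done from the command line instead of the portal. A sketch assuming the Azure CLI; `VAULT` and `RG` are hypothetical placeholders, `<sub-id>` must be filled in with your subscription id, and the commands are echoed for review:

```shell
# Hypothetical placeholders for your vault and resource group.
VAULT="my-keyvault"
RG="my-resource-group"
# Well-known App ID of the first-party AzureDatabricks application.
APP_ID="2ff814a6-3304-4ab8-85cb-cd0e6f879c1d"

# Prints true if the vault is in Azure RBAC mode (Access Policies ignored),
# false/null if it is in Vault Access Policy mode.
CHECK="az keyvault show --name $VAULT --resource-group $RG --query properties.enableRbacAuthorization"

# If RBAC mode: grant the data-plane role to the AzureDatabricks SP
# instead of an access policy. Replace <sub-id> with your subscription id.
GRANT="az role assignment create --assignee $APP_ID --role 'Key Vault Secrets User' --scope /subscriptions/<sub-id>/resourceGroups/$RG/providers/Microsoft.KeyVault/vaults/$VAULT"

# Echoed for review; run them after `az login`.
echo "$CHECK"
echo "$GRANT"
```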


3. Use the CLI Instead of the UI
The #secrets/createScope UI page in Databricks is notoriously fragile for AKV-backed scopes. Use the REST API or Databricks CLI directly — it gives much better error messages:

databricks secrets create-scope \
  --scope <scope-name> \
  --scope-backend-type AZURE_KEYVAULT \
  --resource-id /subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.KeyVault/vaults/<vault-name> \
  --dns-name https://<vault-name>.vault.azure.net/

If this fails, the error returned will be far more specific than the UI's generic session message.
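The same call can also be made against the REST API directly (POST /api/2.0/secrets/scopes/create), which returns the underlying error as a JSON body. A sketch assuming a personal access token in DATABRICKS_TOKEN and the workspace URL in DATABRICKS_HOST; the resource-id placeholders mirror the CLI example above, and the curl command is echoed for review:

```shell
# Request body for POST /api/2.0/secrets/scopes/create (AKV-backed scope).
# <sub-id>, <rg> and <vault-name> are the same placeholders as in the
# CLI example and must be filled in.
PAYLOAD='{
  "scope": "my-akv-scope",
  "scope_backend_type": "AZURE_KEYVAULT",
  "backend_azure_keyvault": {
    "resource_id": "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.KeyVault/vaults/<vault-name>",
    "dns_name": "https://<vault-name>.vault.azure.net/"
  }
}'

# Echoed for review; drop the echo to send the request. The JSON error
# body returned on failure is far more specific than the UI's message.
echo curl -s -X POST "$DATABRICKS_HOST/api/2.0/secrets/scopes/create" \
  -H "Authorization: Bearer $DATABRICKS_TOKEN" \
  -H "Content-Type: application/json" \
  -d "$PAYLOAD"
```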

4. Verify the AzureDatabricks SP is Registered in Your Tenant
The AzureDatabricks enterprise application must exist in your AAD tenant (it's auto-created the first time someone logs into Databricks in the tenant, but occasionally it's missing).

az ad sp list --display-name "AzureDatabricks" --query "[].{AppId:appId, Id:id}"

If this returns empty, you need to provision it:
az ad sp create --id 2ff814a6-3304-4ab8-85cb-cd0e6f879c1d

That GUID is the well-known App ID for the AzureDatabricks first-party application.

Recommended Sequence

1. Confirm the AzureDatabricks SP exists in your tenant (step 4)
2. Confirm the AKV permission model matches how you granted access (step 2)
3. Temporarily open the AKV firewall to all networks and retry via the CLI (steps 1 + 3)
4. If it works, re-lock the firewall and add the correct control plane IPs to the allowlist

The combination of the Private Endpoint + selected-network firewall is the most common cause of this exact symptom. The CLI approach will also get you out of the UI's session-validation quirks entirely.
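Once the scope does create, a quick sanity check from the CLI confirms the backend and the Get/List permissions. A sketch using the same legacy databricks-cli syntax as the create-scope command above; the scope name is a hypothetical placeholder, and the commands are echoed for review:

```shell
# List all scopes; an AKV-backed scope shows AZURE_KEYVAULT as its backend.
LIST="databricks secrets list-scopes"

# Read back the secret names to confirm the Get/List grants actually work.
READ="databricks secrets list --scope my-akv-scope"

# Echoed for review; run them with the CLI configured against your workspace.
echo "$LIST"
echo "$READ"
```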

LR

szymon_dybczak
Esteemed Contributor III

Hi @AnandGNR ,

Try the following: go to your Key Vault, then under Firewalls and virtual networks enable:

"Allow trusted Microsoft services to bypass this firewall."