3 weeks ago
Hi everyone,
I'm trying to create an Azure Key Vault-backed secret scope in a Databricks Premium workspace, but I keep getting this error:
Fetch request failed due expired user session
Setup details:
What I've tried:
Still no luck.
Has anyone faced this issue? Any workarounds/pointers would be appreciated!
Thanks
3 weeks ago
Hi @AnandGNR
The error message "Fetch request failed due expired user session" is a bit misleading here: it's not actually about your browser session. It's a known quirk of the Databricks secret scope creation UI/API when it can't successfully reach or validate the Azure Key Vault endpoint during the creation flow. Here's a structured breakdown of what to check:
1. The Real Culprit: Network Connectivity from Databricks to AKV
Your Key Vault has "Public access from selected networks" enabled with a Private Endpoint. This is almost certainly the root cause.
When Databricks tries to validate the AKV URI during secret scope creation, that call originates from the Databricks control plane, not your browser or your workspace VNet. If the AKV firewall doesn't allow the Databricks control plane IPs, the handshake fails, and the UI surfaces this confusingly as a "session" error.
Fix options:
Temporarily set AKV to "Allow public access from all networks" while creating the scope, then lock it back down. This is the fastest way to confirm if networking is the issue.
Add Databricks control plane IPs to the AKV firewall allowlist. For Azure Databricks on Azure, the control plane egress IPs vary by region. Find yours here: Azure Databricks IP addresses
Ensure "Allow trusted Microsoft services to bypass this firewall" is checked in your AKV networking tab; Databricks (as an Azure-native service) can leverage this.
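If you prefer the Azure CLI over the portal, the first two options can be sketched roughly like this (the vault name and IP are placeholders; verify the values against your own subscription first):

```shell
# Option 1: temporarily allow public access from all networks, then re-lock later
az keyvault update --name <vault-name> --default-action Allow

# Option 2: allowlist a specific Databricks control-plane egress IP instead
az keyvault network-rule add --name <vault-name> --ip-address <control-plane-ip>

# Re-lock once the scope has been created
az keyvault update --name <vault-name> --default-action Deny
```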
2. Access Policy vs. RBAC Mode Mismatch
You mentioned setting Get and List on the AzureDatabricks service principal via Access Policies. Make sure your AKV is actually in Vault Access Policy mode, not Azure RBAC mode; the two are mutually exclusive.
Go to AKV → Access configuration and check the Permission model.
If it's set to Azure role-based access control, your Access Policy grants are being silently ignored. In that case, assign the Key Vault Secrets User role to the AzureDatabricks SP via IAM instead.
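Both checks can be done from the Azure CLI as well; a sketch with placeholder names:

```shell
# true  => the vault uses Azure RBAC; false/null => access-policy mode
az keyvault show --name <vault-name> --query "properties.enableRbacAuthorization"

# In RBAC mode, grant the role via IAM instead of an access policy.
# <databricks-app-id> is the AzureDatabricks service principal's application ID.
az role assignment create \
  --role "Key Vault Secrets User" \
  --assignee <databricks-app-id> \
  --scope /subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.KeyVault/vaults/<vault-name>
```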
3. Use the CLI Instead of the UI
The #secrets/createScope UI page in Databricks is notoriously fragile for AKV-backed scopes. Use the REST API or Databricks CLI directly; it gives much better error messages:
databricks secrets create-scope \
--scope <scope-name> \
--scope-backend-type AZURE_KEYVAULT \
--resource-id /subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.KeyVault/vaults/<vault-name> \
--dns-name https://<vault-name>.vault.azure.net/
If this fails, the error returned will be far more specific than the UI's generic session message.
4. Verify the AzureDatabricks SP is Registered in Your Tenant
The AzureDatabricks enterprise application must exist in your AAD tenant (it's auto-created the first time someone logs into Databricks in the tenant, but occasionally it's missing).
az ad sp list --display-name "AzureDatabricks" --query "[].{AppId:appId, Id:id}"
If this returns empty, you need to provision it:
az ad sp create --id 2ff814a6-3304-4ab8-85cb-cd0e6f879c1d
That GUID is the well-known App ID for the AzureDatabricks first-party application.
Recommended Sequence
Confirm the AzureDatabricks SP exists in your tenant (step 4)
Confirm AKV permission model matches how you granted access (step 2)
Temporarily open AKV firewall to all networks and retry via CLI (steps 1 + 3)
If it works, re-lock the firewall and add the correct control plane IPs to the allowlist
The combination of the Private Endpoint + selected-network firewall is the most common cause of this exact symptom. The CLI approach will also get you out of the UI's session-validation quirks entirely.
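Assuming you have both the Azure CLI and the Databricks CLI installed and authenticated, the recommended sequence can be sketched end to end (all angle-bracket values are placeholders; stop at the first step that fails):

```shell
# Step 1: confirm the AzureDatabricks SP exists in the tenant (empty output = missing)
az ad sp list --display-name "AzureDatabricks" --query "[].appId" -o tsv

# Step 2: confirm the vault's permission model matches how access was granted
az keyvault show --name <vault-name> --query "properties.enableRbacAuthorization"

# Step 3: temporarily open the firewall, retry scope creation via the CLI, re-lock
az keyvault update --name <vault-name> --default-action Allow
databricks secrets create-scope \
  --scope <scope-name> \
  --scope-backend-type AZURE_KEYVAULT \
  --resource-id /subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.KeyVault/vaults/<vault-name> \
  --dns-name https://<vault-name>.vault.azure.net/
az keyvault update --name <vault-name> --default-action Deny
```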
3 weeks ago
Thanks a lot LR for the detailed breakdown!
This was very helpful in narrowing things down. I've walked through the checks you outlined and wanted to share what I'm seeing so far:
Error: Scope with Azure KeyVault must have userAADToken defined!
{
  "scope": "scope_name",
  "initial_manage_principal": "users",
  "scope_backend_type": "AZURE_KEYVAULT",
  "backend_azure_keyvault": {
    "resource_id": "/subscriptions/<SUB_ID>/resourceGroups/<RG_NAME>/providers/Microsoft.KeyVault/vaults/<VAULT_NAME>",
    "dns_name": "https://<VAULT_NAME>.vault.azure.net/"
  }
}
Appreciate the direction so far. It definitely helped isolate this further!
Thanks,
3 weeks ago
Hi @AnandGNR ,
The userAADToken error means the request was authenticated with a Databricks personal access token (PAT), which the AKV-backed scope creation path doesn't accept. Call the REST API directly with an Azure AD user token instead of a PAT.
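A minimal sketch of that flow, assuming the Azure CLI is logged in and that <workspace-url>, <scope-name>, and the vault identifiers are your own values (the GUID is the well-known AzureDatabricks first-party app ID from step 4 earlier in the thread):

```shell
# Acquire an Azure AD user token for the AzureDatabricks first-party app,
# then call the Secrets API with it instead of a Databricks PAT.
AAD_TOKEN=$(az account get-access-token \
  --resource 2ff814a6-3304-4ab8-85cb-cd0e6f879c1d \
  --query accessToken -o tsv)

curl -X POST "https://<workspace-url>/api/2.0/secrets/scopes/create" \
  -H "Authorization: Bearer $AAD_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "scope": "<scope-name>",
    "scope_backend_type": "AZURE_KEYVAULT",
    "backend_azure_keyvault": {
      "resource_id": "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.KeyVault/vaults/<vault-name>",
      "dns_name": "https://<vault-name>.vault.azure.net/"
    }
  }'
```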
3 weeks ago
Hi LR,
This is the sanitized error response from calling the Secrets API with the AD token:
"ERROR: Response status code does not indicate success: 401 (Unauthorized).
{
"error_code": "CUSTOMER_UNAUTHORIZED",
"message": "Unable to grant read/list permission to Databricks service principal to KeyVault 'https://<VAULT_NAME>.vault.azure.net/': Status code 403, '{\"error\": {\"code\": \"RequestDisallowedByPolicy\", \"target\": \"<VAULT_NAME>\", \"message\": \"Resource '<VAULT_NAME>' was disallowed by policy. Policy identifiers: '[{\"policyAssignment\":{\"name\":\"Private endpoint must be configured for Key Vault\",\"id\":\"<POLICY_ASSIGNMENT_ID>\"},\"policyDefinition\":{\"name\":\"Key Vault - Private endpoint must be configured\",\"id\":\"<POLICY_DEFINITION_ID>\",\"version\":\"1.0.0\"}}]'\"}, \"additionalInfo\": [{\"type\": \"PolicyViolation\", \"info\": {\"evaluationDetails\": {\"evaluatedExpressions\": [{\"result\": \"True\", \"expressionKind\": \"Field\", \"expression\": \"type\", \"path\": \"type\", \"expressionValue\": \"Microsoft.KeyVault/vaults\", \"targetValue\": \"Microsoft.KeyVault/vaults\", \"operator\": \"Equals\"}, {\"result\": \"True\", \"expressionKind\": \"Field\", \"expression\": \"Microsoft.KeyVault/vaults/privateEndpointConnections\", \"path\": \"properties.privateEndpointConnections\", \"targetValue\": \"false\", \"operator\": \"Exists\"}]}, \"policyDefinitionId\": \"<POLICY_DEFINITION_ID>\", \"policyDefinitionName\": \"<POLICY_DEFINITION_ID>\", \"policyDefinitionDisplayName\": \"Key Vault - Private endpoint must be configured\", \"policyDefinitionVersion\": \"1.0.0\", \"policyDefinitionEffect\": \"deny\", \"policyAssignmentId\": \"<POLICY_ASSIGNMENT_ID>\", \"policyAssignmentName\": \"<POLICY_ASSIGNMENT_ID>\", \"policyAssignmentDisplayName\": \"Private endpoint must be configured for Key Vault\", \"policyAssignmentScope\": \"<POLICY_SCOPE>\", \"policyAssignmentParameters\": {}, \"policyExemptionIds\": [], \"policyEnrollmentIds\": []}}]}'",
"details": [
{
"@type": "type.googleapis.com/google.rpc.RequestInfo",
"request_id": "<REQUEST_ID>",
"serving_data": ""
}]
}"
Thanks,
3 weeks ago
Hi @AnandGNR ,
My understanding based on the above error is that your org has an Azure Policy assignment, "Private endpoint must be configured for Key Vault", with a deny effect. When Databricks tries to programmatically grant its SP Get/List on your vault during scope creation, Azure Policy intercepts that ARM call and blocks it because the vault modification is being initiated outside the private endpoint path.
The ball is in your Azure Policy admin's court. The Databricks control plane has no way to route its ARM calls through a private endpoint.
This is purely my understanding; please talk to your Azure Policy admin, @AnandGNR.
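If the admin agrees, one possible unblock (my assumption only, not something I've verified against your tenant) is a short-lived policy exemption on just this vault while the scope is created, removed immediately afterwards:

```shell
# Hypothetical one-off exemption the policy admin could create; all names/IDs
# are placeholders taken from the sanitized error above.
az policy exemption create \
  --name databricks-akv-scope-creation \
  --policy-assignment <POLICY_ASSIGNMENT_ID> \
  --scope /subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.KeyVault/vaults/<vault-name> \
  --exemption-category Waiver

# Remove the exemption once the secret scope exists
az policy exemption delete \
  --name databricks-akv-scope-creation \
  --scope /subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.KeyVault/vaults/<vault-name>
```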
3 weeks ago
Hi @AnandGNR ,
Try the following. Go to your Key Vault, then under Firewalls and virtual networks enable:
"Allow trusted Microsoft services to bypass this firewall."
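For reference, the same setting can be applied from the Azure CLI (the vault name is a placeholder):

```shell
# Enable "Allow trusted Microsoft services to bypass this firewall" on the vault
az keyvault update --name <vault-name> --bypass AzureServices
```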
3 weeks ago
Hi @szymon_dybczak: confirming this was already in place from the outset. Thanks!