07-09-2025 01:49 PM
Hello -
Seeing an issue where a notebook triggered from ADF is not able to access secret scopes, which was working earlier.
Here are the steps I took:
1. Provided ADF the Contributor role permission in the Databricks workspace. -> We tested this and were able to trigger the notebook and even access secrets.
2. Manually deleted the service principal in workspace settings. Removed the Contributor role for ADF from Databricks and added it back with the same permission level. -> This time the jobs are failing while accessing Databricks secrets. Cannot even list the secret scopes. The error we are getting is 'Invalid access token' whenever I try to list or use the scopes.
I tried removing the service principal and adding it back many times, but the same issue persists.
07-10-2025 04:29 AM
Hi @gs2, deleting and re-adding the service principal resets its identity context. Even if the same permissions are applied, Databricks treats it as a new identity.
Try generating a new client secret in Azure AD and update your ADF Databricks linked service with it. Then, re-add the service principal to the Databricks workspace and make sure it has the necessary permissions.
07-10-2025 06:31 AM
Hello,
Thank you. But I am not using a secret in the linked service; I am using managed identity authentication, which is working fine. I am able to run the notebook from ADF but not able to list or use the secrets in Databricks.
07-10-2025 12:26 PM
You could try assigning the managed identity admin rights in the workspace. I believe the permissions are not set at the Key Vault level. Use the code below.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import workspace

w = WorkspaceClient()
# Grant the principal MANAGE permission on the secret scope (fill in your scope name and principal)
w.secrets.put_acl(scope="<scope-name>", principal="<principal>", permission=workspace.AclPermission.MANAGE)
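To verify the ACL actually took effect, you can list the ACLs on the scope afterwards (a quick sketch, reusing w from the snippet above; the scope name is a placeholder):

# Print each ACL on the scope to confirm the grant was applied
for acl in w.secrets.list_acls(scope="<scope-name>"):
    print(acl.principal, acl.permission)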
07-11-2025 11:56 AM
Thank you, I already tried that: giving permission to the principal explicitly. It still has the same issues; not able to list or get scopes.
3 weeks ago
I'm experiencing the same issue. Do you know how to solve this?
3 weeks ago
Hey @gs2 @IkuyoshiKuroda ,
I have reviewed the documentation:
According to the Databricks documentation on secret scopes, there are two types:
Databricks-backed scopes → secrets are stored inside Databricks. These do not support authentication via Azure Managed Identities.
databricks secrets put-acl <scope-name> <principal> <permission>
"The principal field specifies an existing Azure Databricks principal."
Azure Key Vault–backed scopes → secrets are stored in your Key Vault. These do work with Managed Identity, as long as the MSI has permission on the Key Vault. "You must have the Key Vault Contributor, Contributor, or Owner role on the Azure key vault instance that you want to use to back the secret scope."
If your ADF pipeline is using a Managed Identity, then you need to:
1. Create a Key Vault–backed secret scope in Databricks that points to your Key Vault (see the sketch after this list).
2. Assign your ADF managed identity the proper role on that Key Vault, such as Key Vault Secrets User, with get and list permissions.
3. Access secrets from Databricks as usual:
dbutils.secrets.get(scope="my_kv_scope", key="my_secret")
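For step 1, creating the Key Vault–backed scope through the SDK would look roughly like this (a sketch only; the resource ID and vault URI are placeholders for your own Key Vault, and this call generally requires Azure AD authentication rather than a PAT):

from databricks.sdk import WorkspaceClient
from databricks.sdk.service import workspace

w = WorkspaceClient()
# Point the new scope at an existing Azure Key Vault (fill in your own resource ID and vault URI)
w.secrets.create_scope(
    scope="my_kv_scope",
    scope_backend_type=workspace.ScopeBackendType.AZURE_KEYVAULT,
    backend_azure_keyvault=workspace.AzureKeyVaultSecretScopeMetadata(
        resource_id="/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.KeyVault/vaults/<vault-name>",
        dns_name="https://<vault-name>.vault.azure.net/",
    ),
)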
If your first setup was working, it’s possible the scope you used originally was already Key Vault–backed, and deleting/re-adding the service principal broke the access configuration. Double-check the Key Vault role assignments for your ADF MSI and make sure the scope in Databricks still points to the correct Key Vault resource ID and DNS name.
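A quick way to double-check this from the SDK (a sketch; the fields come from the SecretScope model) is to list the scopes along with their backing metadata:

from databricks.sdk import WorkspaceClient

w = WorkspaceClient()
# Print each scope's backend type and, for Key Vault-backed scopes, the vault it points to
for s in w.secrets.list_scopes():
    print(s.name, s.backend_type, s.keyvault_metadata)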
🙂
Isi
3 weeks ago
Thank you.
After setting up the scope appropriately as a Databricks-backed scope, it worked fine.
ikuyoshi
3 weeks ago
Hi @Isi, I really like the way you put the resolution together here. Good to learn from it.
3 weeks ago
Thank you @Khaja_Zaffer, I really try my best to provide well-explained solutions. It's motivating to receive this kind of message.