
Issue accessing databricks secrets from ADF

gs2
New Contributor II

Hello -

We are seeing an issue where a notebook triggered from ADF is not able to access secret scopes; this was working earlier.

Here are the steps I took:

1. Granted the ADF managed identity the Contributor role on the Databricks workspace. We tested this and were able to trigger the notebook and even access secrets.

2. Manually deleted the service principal in the workspace settings, removed the Contributor role for ADF from Databricks, and added it back at the same permission level. Since then the jobs fail while accessing Databricks secrets; I cannot even list the secret scopes. The error is 'Invalid access token' whenever I try to list or use the scopes.

I have tried removing the service principal and adding it back several times, but the same issue persists.
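For context, the notebook reads secrets roughly like this (a minimal sketch; the scope and key names below are placeholders, not our actual values):

# List all secret scopes visible to the job's identity
for scope in dbutils.secrets.listScopes():
    print(scope.name)

# Read a single secret value from a scope
value = dbutils.secrets.get(scope="my-scope", key="my-key")

Both calls now fail with the 'Invalid access token' error when the notebook is triggered from ADF.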

 

 

4 REPLIES

Renu_
Valued Contributor II

Hi @gs2, deleting and re-adding the service principal resets its identity context. Even if the same permissions are applied, Databricks treats it as a new identity.

Try generating a new client secret in Azure AD and update your ADF Databricks linked service with it. Then, re-add the service principal to the Databricks workspace and make sure it has the necessary permissions.

gs2
New Contributor II

Hello,

Thank you. But I am not using a client secret in the linked service; I am using managed identity authentication, which is working fine. I am able to run the notebook from ADF, but I am not able to list or use the secrets in Databricks.

 

nayan_wylde
Honored Contributor

You can probably assign the managed identity as an admin in the workspace. I believe the permissions are not set on the secret scope backed by the Key Vault. Use the code below.

from databricks.sdk import WorkspaceClient
from databricks.sdk.service import workspace

w = WorkspaceClient()

# Grant the service principal MANAGE permission on the secret scope
w.secrets.put_acl(scope="<scope_name>", principal="<service_principal_application_id>", permission=workspace.AclPermission.MANAGE)
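To confirm the ACL was applied, you can list the ACLs on the scope afterwards (a quick sketch; it assumes the SDK client is authenticated as a workspace admin and the scope name is a placeholder):

# List the ACLs on the scope and check the principal appears with MANAGE
for acl in w.secrets.list_acls(scope="<scope_name>"):
    print(acl.principal, acl.permission)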

 

gs2
New Contributor II

Thank you, I already tried that, granting the permission to the principal explicitly. It still has the same issue: I am not able to list or get the scopes.

 
