
Using Azure Key Vault secret to access Azure Storage

phguk
New Contributor III

I am trying to configure access to an Azure Storage account (ADLS Gen2) using OAuth. The doc here gives an example of how to specify a secret in a cluster's Spark configuration:

{{secrets/<secret-scope>/<service-credential-key>}}

I can see how this works for secrets stored in a Databricks-backed secret scope.

Instead, however, I want to access a secret stored in an Azure Key Vault, which has its own URL etc.

Any insight and a working example would be much appreciated. Thanks, Paul
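
For reference, the session-level equivalent of that cluster Spark config looks roughly like this (a sketch only; the storage account, application ID, directory ID, scope and key names below are placeholders, not real values):

%scala
// Minimal sketch: OAuth (service principal) access to ADLS Gen2, with the
// client secret pulled from a secret scope rather than hard-coded.
val storageAccount = "<storage-account>"
spark.conf.set(s"fs.azure.account.auth.type.$storageAccount.dfs.core.windows.net", "OAuth")
spark.conf.set(s"fs.azure.account.oauth.provider.type.$storageAccount.dfs.core.windows.net",
  "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
spark.conf.set(s"fs.azure.account.oauth2.client.id.$storageAccount.dfs.core.windows.net", "<application-id>")
spark.conf.set(s"fs.azure.account.oauth2.client.secret.$storageAccount.dfs.core.windows.net",
  dbutils.secrets.get(scope = "<secret-scope>", key = "<service-credential-key>"))
spark.conf.set(s"fs.azure.account.oauth2.client.endpoint.$storageAccount.dfs.core.windows.net",
  "https://login.microsoftonline.com/<directory-id>/oauth2/token")

In a cluster's Spark config, the client-secret line can instead use the {{secrets/<secret-scope>/<service-credential-key>}} placeholder from the documentation.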

4 REPLIES

szymon_dybczak
Contributor III

Hi @phguk , 

You can certainly use an Azure Key Vault-backed secret scope in Databricks.

To reference secrets stored in an Azure Key Vault, you can create a secret scope backed by Azure Key Vault. You can then leverage all of the secrets in the corresponding Key Vault instance from that secret scope.  

To open the secret scope creation page, go to the home page of your Databricks workspace and append #secrets/createScope to the URL, i.e. https://<databricks-instance>#secrets/createScope.

Below is a step-by-step guide on how to do it:

https://learn.microsoft.com/en-us/azure/databricks/security/secrets/secret-scopes#--create-an-azure-...
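
Once the scope is created, you can sanity-check it from a notebook along these lines (the scope name "my-akv-scope" and the key name are just examples):

%scala
// The new Azure Key Vault-backed scope should show up here
dbutils.secrets.listScopes()
// List the key names Databricks can see in the backing Key Vault
dbutils.secrets.list("my-akv-scope")
// Read one secret; the value is redacted if you try to print it in a notebook
val secretValue = dbutils.secrets.get(scope = "my-akv-scope", key = "<your-key>")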

phguk
New Contributor III

Thanks for this link. I've followed its instructions but am stuck on the following. The doc instructs "Set Permission model to Vault access policy", but my org insists on RBAC. Using a notebook in Databricks, I run the following, which refers to a scope/key in Azure Key Vault, and see:

%scala
// Read a secret from the Azure Key Vault-backed scope
val blob_storage_account_access_key = dbutils.secrets.get(scope = "PGSCOPE3", key = "PGKEYA")

com.databricks.common.client.DatabricksServiceHttpClientException: PERMISSION_DENIED: Invalid permissions on the specified KeyVault

My user ID has the Key Vault Administrator role, so I'm wondering how to give Databricks access to the key vault? Any further advice gratefully received. Thanks, Paul

szymon_dybczak
Contributor III

Hi @phguk ,

To make it work you need to assign the Key Vault Secrets User role to the built-in AzureDatabricks service principal. The Key Vault Administrator role that you picked relates to the control plane in Azure, not to reading secret data. (If you are interested in this topic, search for "Azure control plane vs data plane".)

To simplify things, just follow the video below and it'll work 🙂

https://youtu.be/NQv8a8MSVls

phguk
New Contributor III

Many thanks for the link to the useful video. I've now been able to successfully use an Azure Key Vault with Databricks. I have a couple of follow-on questions to pose, if I may:

1. My Azure admin is concerned that the requirement to give the AzureDatabricks enterprise application the Key Vault Secrets User role potentially allows any Databricks workspace in the tenant to access my key vault. This concern was echoed in a 2022 discussion here. Is there an acknowledgement that there's a need to provide better access granularity?

2. Why is there no GUI access in Databricks to a menu/dialog for managing scopes? Sure, it's not onerous to manually create the URL by adding #secrets/createScope, but I'm curious why there's no built-in link to this page. Is this an attempt at security by obscurity?

Many thanks again.
