12-06-2022 06:24 AM
We have a scenario where, ideally, we'd like to use Managed Identities to access both storage and secrets. At present we have a setup in which service principals access secrets through secret scopes, but we foresee ending up with many service principals and the corresponding maintenance burden.
Looking at https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/azure-managed-ident... it seems that Access Connectors would solve the storage-access part. But can we use the "Access Connector for Azure Databricks" to access Azure Key Vault?
Accepted Solutions
12-09-2022 05:43 AM
I have unofficial word that this is not supported, and the docs don't mention it. My feeling is that even if I got it to work, it shouldn't be relied on for now.
12-06-2022 06:38 AM
Where exactly do you need to access Key Vault secrets?
Key Vault can be integrated with a Databricks workspace through the UI at
https://<YOUR_WORKSPACE>.azuredatabricks.net/#secrets/createScope
or via the CLI/API.
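For the CLI/API route, a minimal sketch of what the REST call could look like is below. The workspace URL, vault details, and the `aad_token` variable are placeholders; the field names follow the Databricks Secrets API for Key Vault-backed scopes, but verify them against the current docs before relying on them.

```python
# Hypothetical placeholders -- substitute your own workspace and Key Vault details.
WORKSPACE_URL = "https://<YOUR_WORKSPACE>.azuredatabricks.net"

def build_create_scope_payload(scope_name, kv_resource_id, kv_dns_name):
    """Build the request body for the secrets/scopes/create REST endpoint
    with an Azure Key Vault backend (field names per the Databricks
    Secrets API; check the current docs to confirm)."""
    return {
        "scope": scope_name,
        "scope_backend_type": "AZURE_KEYVAULT",
        "backend_azure_keyvault": {
            "resource_id": kv_resource_id,
            "dns_name": kv_dns_name,
        },
    }

payload = build_create_scope_payload(
    "my-kv-scope",
    "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.KeyVault/vaults/<vault>",
    "https://<vault>.vault.azure.net/",
)
# The actual call would be an authenticated POST with an Azure AD token, e.g.:
# import requests
# requests.post(f"{WORKSPACE_URL}/api/2.0/secrets/scopes/create",
#               headers={"Authorization": f"Bearer {aad_token}"},
#               json=payload)
```

Note that creating a Key Vault-backed scope requires an Azure AD token rather than a Databricks PAT, which is part of why automating this is awkward.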
12-06-2022 07:17 AM
Thanks for your response 🙂
We need to access secrets from notebooks and other tasks running interactively or in workflows.
We're actually using Azure Key Vault-backed secret scopes now, but we rely on service principals to access the Key Vault through the secret scope. Secret scopes are problematic, e.g. because they can't be created in a fully automated way, and access control must be managed through Databricks secret ACLs instead of Key Vault access control (such as Azure RBAC). Service principals also come with a maintenance burden for IT, who need to rotate credentials at regular intervals.
We're looking for ways to avoid having to manage service principals and to use Managed Identities instead.
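As an illustration of what the managed-identity approach would look like in notebook code, the sketch below reads a secret directly from Key Vault with the `azure-identity` and `azure-keyvault-secrets` packages. The vault and secret names are placeholders, and whether the cluster's identity can actually reach the vault is exactly the open question in this thread.

```python
def vault_url(vault_name: str) -> str:
    """Derive the standard public-cloud Key Vault URL from a vault name."""
    return f"https://{vault_name}.vault.azure.net"

# On a compute resource whose managed identity has been granted an RBAC role
# such as "Key Vault Secrets User" on the vault, this would read a secret:
# from azure.identity import ManagedIdentityCredential
# from azure.keyvault.secrets import SecretClient
# client = SecretClient(vault_url=vault_url("my-vault"),
#                       credential=ManagedIdentityCredential())
# value = client.get_secret("my-secret").value
```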
12-08-2022 07:08 AM
Hi Grazie,
Did you manage to get this working?
I am trying to do the same but with no luck so far. I keep getting INVALID_STATE: Databricks could not access keyvault: https://xxxx.vault.azure.net/.
Although I have opened all network access and assigned all the Key Vault-related roles, I keep getting this error, so I am wondering whether it is supported at all...
12-11-2022 11:34 PM
Thanks for your response, Grive.
I ended up using the default Service principal for Databricks (AzureDatabricks).