Referring to Azure Key Vault secrets in Spark config
05-05-2022 04:11 AM
Hi all
In the Spark config for a cluster, it works well to refer to an Azure Key Vault-backed secret in the "value" part of the name/value pair on a config row.
For example, this works fine (I've removed the string that is our specific storage account name):
fs.azure.account.oauth2.client.secret.<storage_account_name>.dfs.core.windows.net {{secrets/secret_scope/secret_value}}
But is it possible to refer to a secret inside the name part of the config row? More specifically, in the example above I would like <storage_account_name> to be dynamic, supplied by a secret (or in any other way) so that it does not need to be hard-coded. That would give us a more generic and reusable Spark config.
I actually tried this, but it doesn't seem to work:
fs.azure.account.oauth2.client.secret.{{secrets/secret_scope/storage_account_name_secret}}.dfs.core.windows.net {{secrets/secret_scope/secret_value}}
Is there a way to achieve this?
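For completeness, one workaround I can think of is to set the property at runtime from a notebook, where the account name can also be read from a secret because the property name is built as an ordinary string. A rough, untested sketch (the scope and key names are just the placeholders from above):

storage_account = dbutils.secrets.get(scope="secret_scope", key="storage_account_name_secret")
client_secret = dbutils.secrets.get(scope="secret_scope", key="secret_value")

# The account name can be interpolated into the property name here,
# which the {{secrets/...}} syntax in the cluster Spark config does not allow.
spark.conf.set(
    f"fs.azure.account.oauth2.client.secret.{storage_account}.dfs.core.windows.net",
    client_secret,
)

But that has to run in every notebook or job, whereas ideally this would live once in the cluster's Spark config.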
Many thanks,
Martin
Labels:
- Azure key vault
- Spark config
07-28-2022 05:24 PM
Hi @Martin Aronsson,
Just a friendly follow-up. Are you still looking for help, or did Kaniz's response help you resolve your issue?
10-02-2023 03:31 AM
Hello,
Is there any update on this issue please?
Databricks no longer recommends mounting external locations, so the other way to access Azure storage is to use Spark config as mentioned in this document - https://learn.microsoft.com/en-us/azure/databricks/storage/azure-storage#connect-to-azure-data-lake-...
The Spark config works fine, but as @Martin1 mentioned, the values for storage-account and directory-id cannot come from a secret because they are part of the property name.
Is there a way of accessing these from a Databricks secret? Hardcoding these values doesn't seem right.
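For what it's worth, the closest workaround I can see is to set the whole OAuth config from a notebook at runtime, where secrets can be read as ordinary strings and placed anywhere, including inside the property names. A rough sketch based on the properties from the Microsoft doc above (the secret scope and key names are placeholders):

storage_account = dbutils.secrets.get(scope="secret_scope", key="storage-account-name")
directory_id = dbutils.secrets.get(scope="secret_scope", key="directory-id")
client_id = dbutils.secrets.get(scope="secret_scope", key="client-id")
client_secret = dbutils.secrets.get(scope="secret_scope", key="client-secret")

# Build the per-account property suffix from the secret value.
suffix = f"{storage_account}.dfs.core.windows.net"
spark.conf.set(f"fs.azure.account.auth.type.{suffix}", "OAuth")
spark.conf.set(f"fs.azure.account.oauth.provider.type.{suffix}",
               "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
spark.conf.set(f"fs.azure.account.oauth2.client.id.{suffix}", client_id)
spark.conf.set(f"fs.azure.account.oauth2.client.secret.{suffix}", client_secret)
spark.conf.set(f"fs.azure.account.oauth2.client.endpoint.{suffix}",
               f"https://login.microsoftonline.com/{directory_id}/oauth2/token")

That works, but it has to be repeated in every notebook or shared setup cell, so a way to reference these values directly from the cluster Spark config would still be much cleaner.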
Thanks,
Kalyani