create databricks scope by reading AWS secrets manager
06-10-2024 07:05 AM
Hi, I have Databricks on AWS and I created some secrets in AWS Secrets Manager. I need to create secret scopes backed by AWS Secrets Manager.
When I use Azure Key Vault, creating the scope offers the option --scope-backend-type AZURE_KEYVAULT, but I didn't find an equivalent for AWS.
How can I create a scope that reads secrets from AWS Secrets Manager, or is that only possible via Python code?
Thanks.
06-10-2024 07:15 AM
Step 1: Create Secret Scope
You can create a secret scope using the Databricks REST API as shown below:
```python
import requests
import json

# Define the endpoint and headers
url = "https://<databricks-instance>/api/2.0/secrets/scopes/create"
headers = {
    "Authorization": "Bearer <your-databricks-token>",
    "Content-Type": "application/json"
}

# Define the payload
payload = {
    "scope": "aws-secrets-scope",
    "initial_manage_principal": "users"
}

# Make the request
response = requests.post(url, headers=headers, data=json.dumps(payload))
if response.status_code == 200:
    print("Secret scope created successfully.")
else:
    print(f"Failed to create secret scope: {response.text}")
```
Is this what you are looking for? Please test it before deploying it to your workload.
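One caveat: the call above creates a Databricks-backed scope, which stores values inside Databricks rather than reading them from AWS, so each secret would still have to be copied into the scope, e.g. via the `/api/2.0/secrets/put` endpoint. A minimal sketch, assuming a placeholder workspace URL and token (the helper names here are my own, not part of any Databricks library):

```python
import json

# Placeholder values -- substitute your own workspace URL and PAT
DATABRICKS_HOST = "https://<databricks-instance>"
DATABRICKS_TOKEN = "<your-databricks-token>"

def build_put_payload(scope, key, value):
    """Build the JSON body expected by POST /api/2.0/secrets/put."""
    return {"scope": scope, "key": key, "string_value": value}

def put_secret(scope, key, value):
    """Write one secret into a Databricks-backed scope."""
    import requests  # imported lazily so the payload helper works without it
    response = requests.post(
        f"{DATABRICKS_HOST}/api/2.0/secrets/put",
        headers={"Authorization": f"Bearer {DATABRICKS_TOKEN}"},
        data=json.dumps(build_put_payload(scope, key, value)),
    )
    response.raise_for_status()
```

Once written, the value is readable from notebooks with dbutils.secrets.get(scope=..., key=...).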
06-10-2024 07:53 AM
That's not quite it. I created secrets in AWS Secrets Manager. With Azure Key Vault, when creating the scope in Databricks I pass a parameter that points it at the Key Vault, but I couldn't find an equivalent parameter for reading from AWS Secrets Manager.
I wanted to understand whether it is simply not supported, or whether it would only work via boto3?
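For what it's worth, Databricks on AWS does not offer an AWS Secrets Manager-backed scope type analogous to AZURE_KEYVAULT; on AWS, secret scopes are Databricks-backed. A common workaround is reading Secrets Manager directly with boto3 from the cluster, assuming the cluster's instance profile grants secretsmanager:GetSecretValue. A hedged sketch (the function names are mine, not a Databricks API):

```python
import json

def parse_secret_string(secret_string):
    """Key/value secrets arrive from Secrets Manager as a JSON string;
    plaintext secrets are wrapped under a single 'value' key instead."""
    try:
        return json.loads(secret_string)
    except json.JSONDecodeError:
        return {"value": secret_string}

def get_aws_secret(secret_name, region_name="us-east-1"):
    """Fetch and decode one secret from AWS Secrets Manager."""
    import boto3  # imported lazily; preinstalled on Databricks runtimes
    client = boto3.client("secretsmanager", region_name=region_name)
    response = client.get_secret_value(SecretId=secret_name)
    return parse_secret_string(response["SecretString"])
```

This reads at call time rather than syncing into a scope, so rotated secrets are picked up automatically, at the cost of values not being redacted in notebook output the way dbutils.secrets.get results are.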

