
AWS Secrets Works In One Cluster But Not Another

dbdude
New Contributor II

Why can I use boto3 to retrieve a secret from AWS Secrets Manager on a personal cluster, but get the following error on a shared cluster?

NoCredentialsError: Unable to locate credentials

 

7 REPLIES

Kaniz_Fatma
Community Manager

Hi @dbdude, The issue you're experiencing with a shared cluster could be due to several reasons, but based on the error message "NoCredentialsError: Unable to locate credentials", it seems the shared cluster might not have the correct permissions to retrieve secrets from Secrets Manager.

Here are some potential causes and respective solutions:

• Misconfiguration of the secret: Ensure the secret is correctly configured in Secrets Manager. You can check this by trying to fetch the secret in a notebook as the cluster owner. If it fails, you'll need to fix the configuration problem.

• Too many API requests from the workspace: If the secret API for the workspace returned 429, it means there were too many requests. You may need to spread out cluster and job submissions over time.

• Secrets Manager or Azure Key Vault service outage: Check the AWS and Azure service health dashboards to see whether Secrets Manager (or Azure Key Vault, for Azure workspaces) was experiencing an outage or operation failures at the time. If so, retry after the service recovers, or open a support ticket with Databricks.

• Permissions: Ensure that the shared cluster has the necessary permissions to access Secrets Manager. This could be a "Can Attach To" or a "Can Restart" permission, depending on the exact requirements of your setup. Remember always to use secrets to store sensitive information, such as passwords, instead of plaintext. You can reference a secret in the Spark configuration using the syntax {{secrets/<scope-name>/<secret-name>}} (a short sketch follows this list).
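For reference, here is a minimal sketch of reading a value from a Databricks secret scope in a notebook; the scope and key names are hypothetical placeholders:

    # In a Databricks notebook, dbutils is available without an import.
    # "my-scope" and "my-key" are hypothetical; substitute your own scope and key.
    value = dbutils.secrets.get(scope="my-scope", key="my-key")

    # The equivalent reference in a cluster's Spark configuration looks like:
    #   spark.password {{secrets/my-scope/my-key}}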

Szpila
New Contributor III

Nice reply from ChatGPT, but it seems that the true cause is that Databricks intentionally prevents users from using the credentials of the host machine.

Kaniz_Fatma
Community Manager

Hi @Szpila, I apologize for the oversight. Let me provide you with a more accurate answer to address your question.

drii_cavalcanti
New Contributor III

Hey @Szpila, have you found a solution for it? I am currently encountering the same issue.

Husky
New Contributor III

Hey @dbdude, I am facing the same error. Did you find a solution to access the AWS credentials on a Shared Cluster?

This article describes a way of storing credentials in a Unity Catalog Volume to fetch by the Shared Cluster:

https://medium.com/@amlucius/securely-accessing-external-services-on-databricks-shared-clusters-with...

But I am not a fan of storing the credentials in a bucket.
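For illustration, the article's approach boils down to something like the sketch below (my own paraphrase, not the article's code; the Volume path and key names are hypothetical placeholders):

    import json

    # Read AWS keys from a JSON file stored in a Unity Catalog Volume.
    with open("/Volumes/my_catalog/my_schema/credentials/aws_credentials.json") as f:
        creds = json.load(f)

    access_key_id = creds["aws_access_key_id"]
    secret_access_key = creds["aws_secret_access_key"]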

@Kaniz_Fatma The reason why fetching the AWS credentials on a Shared Cluster does not work is a limitation of the network and file system access of Shared Clusters. See https://docs.databricks.com/en/compute/access-mode-limitations.html

Cannot connect to the instance metadata service (IMDS), other EC2 instances, or any other services running in the Databricks VPC. This prevents access to any service that uses the IMDS, such as boto3 and the AWS CLI.
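A quick way to see this in action (a diagnostic sketch, assuming no credentials are configured anywhere else):

    import boto3

    # The default provider chain ends with the IMDS. On a shared-access cluster
    # the IMDS is unreachable, so with no credentials set elsewhere this should
    # come back as None once the chain is exhausted.
    creds = boto3.Session().get_credentials()
    print(creds)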

 

Kaniz_Fatma
Community Manager

Hi @Husky, Thank you for sharing the link and providing additional context!

Let's dive into the issue of accessing AWS credentials on a Databricks Shared Cluster.

The limitation you're encountering is related to the network and file system access of Shared Clusters. Specifically, Shared Clusters cannot connect to the Instance Metadata Service (IMDS), other EC2 instances, or any other services running within the Databricks VPC. This restriction prevents access to services that rely on the IMDS, including boto3 and the AWS CLI.

Here are some considerations and alternative approaches:

  1. Unity Catalog Volume (UCV):

    • The article you mentioned proposes using a Unity Catalog Volume (UCV) to store credentials. While this approach can work, it's essential to understand the trade-offs.
    • UCVs provide a way to securely store secrets and configuration files within Databricks. However, they are specific to Databricks and not directly tied to AWS services.
    • By using a UCV, you can avoid storing credentials in a public bucket, but you'll still need to manage access to the UCV itself.
    • Keep in mind that UCVs are not accessible outside of Databricks, so if you need to use the same credentials elsewhere (e.g., in your local development environment), you'll need an additional mechanism.
  2. Alternative Approaches (AWS CLI and Boto3):

    • Since Shared Clusters cannot directly access the IMDS, you won't be able to fetch AWS credentials using the standard methods (e.g., IAM roles).
    • Consider using an alternative approach such as:
      • Local Development: Fetch the credentials locally (outside of Databricks) and pass them explicitly to your code running on the Shared Cluster.
      • Environment Variables: Set environment variables within your Databricks notebook or job to provide the necessary credentials.
      • Secrets Management: Use a secrets management service (e.g., AWS Secrets Manager) to store and retrieve credentials securely. You can then fetch the secrets from your code running on the Shared Cluster (see the sketch after this list).

Remember that any approach involving credentials should prioritize security and minimize exposure. Choose the method that aligns best with your use case and security requirements.

If you have further questions or need additional assistance, feel free to ask! 😊🔑

 

Kaniz_Fatma
Community Manager

Hi @dbdude and @drii_cavalcanti, The NoCredentialsError you're encountering when using Boto3 to retrieve a secret from AWS Secrets Manager typically indicates that the AWS SDK is unable to find valid credentials for your API request.

Let's explore some possible reasons and solutions:

  1. IAM Role and Permissions:

    • Personal Cluster: When using Boto3 with a personal cluster, it's likely that your IAM role or user account has the necessary permissions to access Secrets Manager. You might have configured your credentials (such as access key and secret key) correctly.
    • Shared Cluster: In a shared cluster, the IAM role or user account associated with the cluster might not have the required permissions. Ensure that the IAM role attached to the EC2 instance (if using EC2) or the IAM user (if using local development) has the necessary permissions to access Secrets Manager. Specifically, the role or user should have the secretsmanager:GetSecretValue permission (a minimal policy sketch follows this list).
    • Solution: Attach an IAM instance profile to the EC2 instance (for shared clusters) or configure the ...
  2. Misconfiguration of the Secret:

    • Verify that the secret exists in Secrets Manager and that you are referencing it by the correct name or ARN, as noted in the earlier reply.
  3. Credentials Setup:

    • In your Python code, make sure you've imported the boto3 library. To retrieve a secret, you need to know the name or ARN (Amazon Resource Name) of the secret.
    • Solution: Here's a simple example of how to retrieve a secret using Boto3:
    import boto3
    from botocore.exceptions import ClientError
    
    def get_secret(secret_name):
        # Create a Secrets Manager client
        client = boto3.client('secretsmanager')
    
        try:
            # Attempt to retrieve the secret value
            get_secret_value_response = client.get_secret_value(SecretId=secret_name)
        except ClientError as e:
            # Handle exceptions if the secret can't be retrieved
            raise e
    
        # Process the retrieved secret
        if 'SecretString' in get_secret_value_response:
            secret = get_secret_value_response['SecretString']
        else:
            # For binary secrets, decode them before using
            secret = get_secret_value_response['SecretBinary'].decode('utf-8')
    
        return secret
    
    # Usage
    secret_name = "MySecretName"
    retrieved_secret = get_secret(secret_name)
    print(f"Retrieved secret: {retrieved_secret}")
    

    Remember to set up your AWS credentials (e.g., using the AWS CLI with aws configure) so that Boto3 can locate them.

  4. Other Considerations:

    • Check if there are any network issues or connectivity problems preventing your cluster from reaching the AWS services.
    • Verify that the region specified in your Boto3 client matches the region where your secret is stored.

Remember to replace "MySecretName" with the actual name or ARN of your secret. If you follow these steps, you should be able to retrieve secrets successfully from both personal and shared clusters. If you encounter further issues, feel free to ask for more assistance! 😊🔐

 
