Hi there,
I am trying to upload a file to an S3 bucket, but none of the dbutils commands work, and neither does the boto3 library. Clusters with the same configuration work fine in every access mode except shared access mode.
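For reference, this is roughly what I am running; the bucket name, key, and local path below are placeholders, not my real values:

```python
import boto3

# Attempt 1: dbutils (provided by the Databricks notebook runtime).
# This raises the AccessDeniedException shown below.
dbutils.fs.cp("file:/tmp/report.csv", "s3://my-bucket/reports/report.csv")

# Attempt 2: boto3, relying on the instance profile via the default
# credential chain. This fails with "Unable to locate credentials".
s3 = boto3.client("s3")
s3.upload_file("/tmp/report.csv", "my-bucket", "reports/report.csv")
```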
These are the error messages I am getting:
```
java.nio.file.AccessDeniedException: : Instantiate shaded.databricks.org.apache.hadoop.fs.s3a.auth.AssumedRoleCredentialProvider: com.amazonaws.services.securitytoken.model.AWSSecurityTokenServiceException: User: arn:aws:sts::*:assumed-role/[same as the instance profile config on the cluster]/i-* is not authorized to perform: sts:AssumeRole on resource: arn:aws:iam::*:role/[same as the instance profile config on the cluster] (Service: AWSSecurityTokenService; Status Code: 403; Error Code: AccessDenied)

java.nio.file.AccessDeniedException: s3://: shaded.databricks.org.apache.hadoop.fs.s3a.auth.NoAuthWithAWSException: No AWS Credentials provided by AwsCredentialContextTokenProvider : com.amazonaws.SdkClientException: Unable to load AWS credentials from any provider in the chain: [com.databricks.backend.daemon.driver.aws.AwsLocalCredentialContextTokenProvider@*: No role specified and no roles available., com.databricks.backend.daemon.driver.aws.ProxiedIAMCredentialProvider@*: User does not have any IAM roles]

Unable to locate credentials. You can configure credentials by running "aws configure".
```

(The last message comes from the AWS CLI, even though credentials are configured at cluster start by an init script.)
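In case it helps anyone reproduce this, here is a minimal sanity check I run on the driver to see what the default credential chain resolves to; nothing in it is specific to my setup:

```python
import boto3
from botocore.exceptions import NoCredentialsError

# Check whether the default provider chain resolves any credentials on the driver.
session = boto3.Session()
creds = session.get_credentials()
if creds is None:
    print("No credentials resolved by the default provider chain")
else:
    frozen = creds.get_frozen_credentials()  # immutable snapshot, safe to inspect
    print("Resolved access key prefix:", frozen.access_key[:4])

try:
    # On a working (non-shared) cluster this returns the assumed-role ARN.
    print(boto3.client("sts").get_caller_identity()["Arn"])
except NoCredentialsError:
    print("STS call failed: no credentials available")
```

On the shared-access-mode cluster this prints that no credentials were resolved, which matches the provider-chain error above.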
Has anyone encountered this issue before? If so, is there anything that I am missing here?
Thank you so much,
Adriana Cavalcanti