I am unable to write data from Databricks into an S3 bucket. I have set up permissions at both the bucket policy level and the IAM user level (s3:PutObject, s3:ListBucket, and other actions are allowed; I have also tried granting s3:*).
The bucket region and the workspace region are the same. I have also tried a cluster with a cross-account instance profile that has all the required permissions, both on the IAM role and in the bucket policy. Unchecking the "Block all public access" box on the bucket made no difference; I still get the same error.
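In case it helps narrow things down, below is a rough check I can run from the notebook to confirm which IAM principal the keys resolve to and whether that principal can list the bucket. The bucket name and keys are placeholders (the same ones used in the write snippet further down); I can share the actual output if that would be useful.

import boto3

# Placeholders: same keys and bucket as in the write attempt below
aws_bucket_name = "<bucket_name>"
AWS_ACCESS_KEY_ID = "<my_key_ID>"
AWS_SECRET_ACCESS_KEY = "<my_key>"

# Which IAM principal do these keys actually resolve to?
sts = boto3.client(
    "sts",
    aws_access_key_id=AWS_ACCESS_KEY_ID,
    aws_secret_access_key=AWS_SECRET_ACCESS_KEY,
)
print(sts.get_caller_identity()["Arn"])

# Can that principal list the bucket at all?
s3_client = boto3.client(
    "s3",
    region_name="ap-south-1",
    aws_access_key_id=AWS_ACCESS_KEY_ID,
    aws_secret_access_key=AWS_SECRET_ACCESS_KEY,
)
print(s3_client.list_objects_v2(Bucket=aws_bucket_name, MaxKeys=1).get("KeyCount"))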
Code used:
import boto3

# Values used to connect to S3
aws_bucket_name = "<bucket_name>"
ddlscriptextlocation = f"s3://{aws_bucket_name}/sample/"
path = f"s3a://{aws_bucket_name}/"
AWS_SECRET_ACCESS_KEY = "<my_key>"
AWS_ACCESS_KEY_ID = "<my_key_ID>"

s3 = boto3.resource(
    's3',
    region_name='ap-south-1',
    aws_access_key_id=AWS_ACCESS_KEY_ID,
    aws_secret_access_key=AWS_SECRET_ACCESS_KEY
)

# Write a small text object into the bucket
content = "String content to write to a new S3 file"
s3.Object(aws_bucket_name, f"s3://{aws_bucket_name}/sample/newfile.txt").put(Body=content)
Error seen:
ClientError: An error occurred (AccessDenied) when calling the ListObjects operation: Access Denied
I also tried mounting the bucket to DBFS. The mount itself succeeds, but I get a 403 Forbidden error when I try to list it with dbutils.fs.ls (rough mount code below).
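For reference, the mount was created roughly like this, following the access-key pattern from the Databricks docs. The mount name is a placeholder, and the secret key is URL-encoded as the docs suggest.

access_key = "<my_key_ID>"
secret_key = "<my_key>".replace("/", "%2F")   # URL-encode any "/" in the secret key
aws_bucket_name = "<bucket_name>"
mount_name = "<mount_name>"

# Mounting the bucket under /mnt/<mount_name> works without errors
dbutils.fs.mount(f"s3a://{access_key}:{secret_key}@{aws_bucket_name}", f"/mnt/{mount_name}")

# Listing the mount is where the 403 Forbidden shows up
display(dbutils.fs.ls(f"/mnt/{mount_name}"))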
Any help on this is greatly appreciated. Do let me know if any other information is required from me. Thanks in advance!