Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Unable to write to S3 bucket from Databricks using boto3

New Contributor II

I am unable to write data from Databricks to an S3 bucket. I have set up permissions at both the bucket-policy level and the user level (PutObject, ListBucket, and others are included; I have also tried s3:*).

The bucket region and the workspace region are the same. I have also tried a cluster with a cross-account instance profile that has all the required permissions, both at the IAM-role level and the bucket-policy level, and tried unchecking the "Block all public access" checkbox, but I get the same error.
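For reference, this is a minimal sketch of the shape of bucket policy I mean (bucket name and role ARN are placeholders, not my real values); as I understand it, s3:ListBucket must be granted on the bucket ARN itself, while object-level actions like s3:PutObject are granted on objects under it:

import json

# Placeholders -- not real values
bucket = "<bucket_name>"
role_arn = "arn:aws:iam::<account_id>:role/<instance_profile_role>"

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            # Bucket-level action: granted on the bucket ARN itself
            "Effect": "Allow",
            "Principal": {"AWS": role_arn},
            "Action": ["s3:ListBucket"],
            "Resource": f"arn:aws:s3:::{bucket}",
        },
        {
            # Object-level actions: granted on objects under the bucket
            "Effect": "Allow",
            "Principal": {"AWS": role_arn},
            "Action": ["s3:PutObject", "s3:GetObject"],
            "Resource": f"arn:aws:s3:::{bucket}/*",
        },
    ],
}

print(json.dumps(policy, indent=2))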


Code used:

import boto3

# Values to connect to S3
aws_bucket_name = "<bucket_name>"
ddlscriptextlocation = f"s3://{aws_bucket_name}/sample/"
path = f"s3a://{aws_bucket_name}/"
AWS_ACCESS_KEY_ID = "<my_key_ID>"
AWS_SECRET_ACCESS_KEY = "<my_secret_key>"

s3 = boto3.resource(
    "s3",
    aws_access_key_id=AWS_ACCESS_KEY_ID,
    aws_secret_access_key=AWS_SECRET_ACCESS_KEY,
)

content = "String content to write to a new S3 file"
s3.Object(aws_bucket_name, f"s3://{aws_bucket_name}/sample/newfile.txt").put(Body=content)

Error seen:
ClientError: An error occurred (AccessDenied) when calling the ListObjects operation: Access Denied

I also tried mounting the bucket to DBFS; the mount succeeded, but I receive a 403 Forbidden error when trying to view it.

Any help on this is greatly appreciated. Do let me know if any other information is required from me. Thanks in advance!

Community Manager

Hi @Debi-Moha

  • Ensure that the IAM role associated with your Databricks cluster has the necessary permissions to access the S3 bucket. Specifically, it should have permissions for s3:PutObject and s3:ListBucket.
  • Double-check that the IAM role is correctly configured with the appropriate policies granting these permissions.
  • Verify that the bucket policy allows the IAM role associated with your Databricks cluster to perform the required actions (e.g., s3:PutObject, s3:ListBucket).
  • Make sure the bucket policy does not inadvertently restrict access.
  • If you’re using a cross-account instance profile, ensure that it has the necessary permissions.
  • Verify that the instance profile is correctly associated with your Databricks cluster.
  • Confirm that the region of your S3 bucket matches the region of your Databricks workspace, and that any region specified in your code agrees with it.
  • Check if the AWS access key and secret key (AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY) are correctly set in your Databricks environment.
  • Make sure there are no typos or issues with these credentials.
  • Ensure that the bucket is not configured to block all public access. You mentioned trying to remove the “Block all access” checkbox, but still encountering the error. Double-check this setting.
  • When mounting the bucket to DBFS, ensure that the IAM role or instance profile used for mounting has the necessary permissions.
  • Verify that the mount point is correctly created and accessible.
  • The specific error message you received is: ClientError: An error occurred (AccessDenied) when calling the ListObjects operation: Access Denied. This suggests that the IAM role or credentials lack the required permissions.
  • If you have any additional logs or details related to the error, please share them — it might help pinpoint the issue, and we’ll continue troubleshooting!
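Also, one detail in the snippet itself is worth fixing: s3.Object(bucket, key) expects the object key relative to the bucket, not a full s3:// URI. A quick sketch of the difference (bucket name is a placeholder):

aws_bucket_name = "<bucket_name>"

# What the snippet currently builds -- S3 would treat this whole string
# as the literal object key, "s3://" prefix included:
uri_style_key = f"s3://{aws_bucket_name}/sample/newfile.txt"

# What s3.Object() actually expects -- the key relative to the bucket:
relative_key = "sample/newfile.txt"

# The corrected write (needs valid credentials, so shown commented out):
#   import boto3
#   s3 = boto3.resource("s3")
#   s3.Object(aws_bucket_name, relative_key).put(
#       Body="String content to write to a new S3 file"
#   )

print(uri_style_key)
print(relative_key)

This doesn’t explain the AccessDenied on ListObjects by itself, but it will bite once the permissions are sorted out.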