Data Engineering

"AmazonS3Exception: The bucket is in this region" error

DanielAnderson
New Contributor

I have read access to an S3 bucket in an AWS account that is not mine. For more than a year I've had a job successfully reading from that bucket using dbutils.fs.mount(...) and sqlContext.read.json(...). Recently the job started failing on the sqlContext.read.json() command with the exception: "com.amazonaws.services.s3.model.AmazonS3Exception: The bucket is in this region: us-east-1. Please use this region to retry the request."

My Databricks Cloud platform is in us-west-2, and the bucket may have been moved, but as far as I understand from this question that shouldn't be a problem: https://forums.databricks.com/questions/416/does-my-s3-data-need-to-be-in-the-same-aws-region.html

I have no problem accessing the bucket with boto3 (with no need to specify a region). I've also tried setting the cluster's 'spark.hadoop.fs.s3a.endpoint' to 's3.us-east-1.amazonaws.com'. Surprisingly, this resulted in the same error, but on the mount() command and saying that the bucket is in the 'us-west-2' region.
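
For reference, a minimal sketch of what the job does (the bucket name, mount point, path, and credentials below are placeholders, not the real values):

# Mount the external bucket and read JSON from it (placeholders throughout)
dbutils.fs.mount(
    source="s3a://ACCESS_KEY:SECRET_KEY@external-bucket",  # placeholder credentials and bucket
    mount_point="/mnt/external-bucket",
)
df = sqlContext.read.json("/mnt/external-bucket/path/to/data")  # this read raises the AmazonS3Exception

The endpoint override I mentioned was added to the cluster's Spark config as:

spark.hadoop.fs.s3a.endpoint s3.us-east-1.amazonaws.com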

I'm a bit confused about the possible causes of this error and would appreciate some pointers.

Thanks!

1 REPLY

Chandan
New Contributor II

@andersource

Looks like the bucket is in us-east-1, but your Amazon S3 client / Databricks platform is configured for us-west-2. Can you try configuring the client to use us-east-1 instead?

I hope it will work for you. Thank you.
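
For example, something along these lines in a notebook cell might do it (the bucket name is a placeholder, and per-bucket S3A settings need a Databricks runtime whose Hadoop version supports them):

# Point only this bucket at us-east-1, leaving the rest of the workspace on us-west-2
# ("external-bucket" is a placeholder for the real bucket name)
spark.conf.set("fs.s3a.bucket.external-bucket.endpoint", "s3.us-east-1.amazonaws.com")

df = spark.read.json("s3a://external-bucket/path/to/data")

If that doesn't take effect for mounts, setting spark.hadoop.fs.s3a.endpoint to s3.us-east-1.amazonaws.com in the cluster's Spark configuration and restarting the cluster is the cluster-wide equivalent of the same idea.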
