I have read access to an S3 bucket in an AWS account that is not mine. For more than a year I've had a job successfully reading from that bucket using dbutils.fs.mount(...) and sqlContext.read.json(...).

Recently the job started failing on the sqlContext.read.json() command with the exception: "com.amazonaws.services.s3.model.AmazonS3Exception: The bucket is in this region: us-east-1. Please use this region to retry the request."

My Databricks Cloud platform is in us-west-2, and the bucket may have been moved, but as far as I understand from this question, that shouldn't be a problem: https://forums.databricks.com/questions/416/does-my-s3-data-need-to-be-in-the-same-aws-region.html . I also have no problem accessing the bucket with boto3 (with no need to specify a region).

I've also tried setting the cluster's 'spark.hadoop.fs.s3a.endpoint' to 's3.us-east-1.amazonaws.com'. Surprisingly, this resulted in the same kind of error, this time on the mount() command, now saying that the bucket is in the 'us-west-2' region.
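For reference, this is roughly what the job and the attempted workaround look like. It's a minimal sketch assuming a Databricks notebook where dbutils, sc, and sqlContext are already defined; the bucket name, mount point, and data path below are placeholders, not the real values.

```python
import boto3

# Placeholders; the real bucket name, mount point, and data path differ.
BUCKET = "their-bucket"
MOUNT_POINT = "/mnt/their-bucket"

# Mount the external bucket via the s3a connector (credential setup, e.g.
# an instance profile or embedded access keys, omitted for brevity).
dbutils.fs.mount("s3a://%s" % BUCKET, MOUNT_POINT)

# Read JSON from the mounted path; this is the call that now throws
# "The bucket is in this region: us-east-1. Please use this region ...".
df = sqlContext.read.json("%s/path/to/data/*.json" % MOUNT_POINT)

# Attempted workaround: point the s3a connector at us-east-1 explicitly,
# equivalent to setting spark.hadoop.fs.s3a.endpoint in the cluster's
# Spark config. With this in place, mount() itself fails instead,
# complaining that the bucket is in us-west-2.
sc._jsc.hadoopConfiguration().set("fs.s3a.endpoint", "s3.us-east-1.amazonaws.com")

# By contrast, plain boto3 reads the same bucket fine with no region specified.
s3 = boto3.client("s3")
s3.list_objects_v2(Bucket=BUCKET, MaxKeys=1)
```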
I'm a bit confused about the possible causes of this error and would appreciate any pointers.