When running a Databricks notebook attached to a cluster that reads from S3, I randomly but frequently hit the following error:
java.nio.file.AccessDeniedException: s3://mybucket: getFileStatus on s3://mybucket: com.amazonaws.services.s3.model.AmazonS3Exception: Forbidden; request: HEAD https://mybucket.s3.eu-west-2.amazonaws.com hybrid {} Hadoop 3.3.4, aws-sdk-java/1.12.189 Linux/5.15.0-1045-aws OpenJDK_64-Bit_Server_VM/25.362-b09 java/1.8.0_362 scala/2.12.14 vendor/Azul_Systems,_Inc. cfg/retry-mode/legacy com.amazonaws.services.s3.model.GetObjectMetadataRequest; Request ID: BPE1N5PBRHX6AXXX, Extended Request ID: A9yWOO63cwg7PgdVihRC/bqVlOBwElgqYliThPpm56lH0lM/Xf09+g8Dkzdylpp422togli3000=, Cloud Provider: AWS, Instance ID: i-0ed293c3b4d8b0e15 credentials-provider: com.amazonaws.auth.AnonymousAWSCredentials credential-header: no-credential-header signature-present: false (Service: Amazon S3; Status Code: 403; Error Code: 403 Forbidden; Request ID: BPE1N5PBRHX6AXXX; S3 Extended Request ID: A9yWOO63cwg7PgdVihRC/bqVlOBwElgqYliThPpm56lH0lM/Xf09+g8Dkzdylpp422togli3000=; Proxy: null), S3 Extended Request ID: A9yWOO63cwg7PgdVihRC/bqVlOBwElgqYliThPpm56lH0lM/Xf09+g8Dkzdylpp422togli3000=:403 Forbidden
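The failing call is nothing exotic; a plain read along these lines triggers it (the format, bucket, and path below are placeholders for the real ones):

# Representative of the reads that intermittently fail with the 403 above;
# the actual bucket/path differ but the pattern is the same.
df = spark.read.parquet("s3://mybucket/path/to/data")
display(df)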
When I submit jobs to the same cluster from VS Code, the issue does not occur. Is there a best practice for setting up S3 access in notebooks? The error is intermittent but frequent, has no obvious trigger, and interrupts development. A sketch of what I mean by "setting up S3" follows below.
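For example, since the trace shows the request being signed with com.amazonaws.auth.AnonymousAWSCredentials, would explicitly pinning the S3A credentials provider in the notebook be the right approach? Something like this sketch (the config key is from the Hadoop S3A docs; the provider class and path are my guesses, not a known fix):

# Sketch only -- I'm not sure this is correct, which is why I'm asking.
# Pin the S3A credentials provider to the instance profile instead of
# letting the default chain fall through to anonymous credentials.
hconf = spark.sparkContext._jsc.hadoopConfiguration()
hconf.set(
    "fs.s3a.aws.credentials.provider",
    "com.amazonaws.auth.InstanceProfileCredentialsProvider",
)
df = spark.read.parquet("s3a://mybucket/path/to/data")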
Any help is greatly appreciated!