Data Engineering
The cluster with the instance profile cannot access the S3 bucket. 403 permission denied is thrown

User16752239289
Valued Contributor

The documentation was followed to configure the instance profile. An EC2 instance configured with the same instance profile can access the S3 bucket. However, a cluster configured to use that instance profile fails to access the S3 bucket with a 403 permission denied error.

1 REPLY

User16752239289
Valued Contributor

I suspect this is because AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY have been added to the cluster's Spark environment variables.
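This would explain why the same instance profile works on a bare EC2 instance but not on the cluster: in the AWS default credential provider chain, static environment variables are consulted before the EC2 instance profile, so stray keys shadow the profile. A simplified model of that ordering (a sketch, not the actual SDK code):

```python
# Simplified model of the AWS default credential provider chain:
# environment variables are checked before the EC2 instance profile,
# so stray static keys take precedence over the profile's credentials.
def resolve_credential_source(env):
    if "AWS_ACCESS_KEY_ID" in env and "AWS_SECRET_ACCESS_KEY" in env:
        return "environment"       # stray keys win here
    return "instance-profile"      # otherwise fall through to EC2 metadata

# EC2 instance with no static keys: the instance profile is used.
print(resolve_credential_source({}))  # instance-profile

# Cluster with stray keys in the Spark environment (hypothetical values):
print(resolve_credential_source({
    "AWS_ACCESS_KEY_ID": "AKIA_EXAMPLE",
    "AWS_SECRET_ACCESS_KEY": "example-secret",
}))  # environment
```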

You can run %sh env | grep -i aws on your cluster and make sure AWS_ACCESS_KEY_ID is not present.

If it is present, please remove it either from the cluster's advanced options -> Spark -> Environment Variables, or from the init scripts. Make sure to check both cluster-scoped init scripts and global init scripts.
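As a notebook-friendly alternative to the %sh cell, a Python cell can do the same check. A minimal sketch (it only inspects the driver's environment):

```python
import os

def list_aws_vars(env=os.environ):
    """Names of AWS-related environment variables,
    mirroring `%sh env | grep -i aws`."""
    return sorted(k for k in env if "aws" in k.lower())

# Print what the driver sees and flag a stray static key.
print("\n".join(list_aws_vars()))
if "AWS_ACCESS_KEY_ID" in os.environ:
    print("static key present -- it will override the instance profile")
```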
