Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

The cluster with the instance profile cannot access the S3 bucket; a 403 Permission Denied error is thrown

User16752239289
Databricks Employee

I followed the documentation to configure the instance profile. An EC2 instance configured with the same instance profile is able to access the S3 bucket. However, the cluster configured to use that instance profile fails to access the S3 bucket with a permission denied error.

1 REPLY

User16752239289
Databricks Employee

I suspect this is because AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY have been added to the Spark environment variables. When these static credentials are set, they take precedence over the instance profile.

You can run %sh env | grep -i aws on your cluster and confirm that AWS_ACCESS_KEY_ID is not present.

If it is present, remove it either from the cluster's Advanced Options -> Spark -> Environment Variables or from the init scripts. Make sure to check both cluster-scoped init scripts and global init scripts.
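The same check can be done from a Python notebook cell instead of %sh. A minimal sketch; it only assumes the standard AWS SDK variable names (AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY), nothing Databricks-specific:

```python
import os

# List any AWS-related variables visible to the driver process.
# If AWS_ACCESS_KEY_ID appears here, those static credentials will
# shadow the instance profile and can cause the 403 error.
aws_vars = [name for name in os.environ if name.startswith("AWS_")]

for name in aws_vars:
    print(name)

if "AWS_ACCESS_KEY_ID" in os.environ:
    print("Static credentials found; these take precedence over the instance profile.")
```

If the variable shows up here but not in your cluster's Environment Variables settings, it was most likely exported by an init script.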
