
How to grant custom container AWS credentials for reading init script?

mrstevegross
Contributor

I'm using a custom container *and* init scripts. At runtime, I get this error:

Cluster '...' was terminated. Reason: INIT_SCRIPT_FAILURE (CLIENT_ERROR). Parameters: instance_id:i-0440ddd3a2d5cce79, databricks_error_message:Cluster scoped init script s3://<our_bucket>/<our_init_script.sh> failed: Timed out with exception after 5 attempts (debugStr = 'Reading remote file for init script'), Caused by: com.databricks.objectstore.location.PermanentStorageException$AwsForbidden: Missing credentials to access AWS bucket.

It worked previously *without* the container, so I'm pretty sure the use of the container is triggering the problem. I suspect that the fetch-init-scripts-from-s3 operation is occurring *inside* the container, and that the container itself lacks AWS credentials. What's the preferred way to pass AWS credentials to a custom container?

1 ACCEPTED SOLUTION


mrstevegross
Contributor

Followup: I got the AWS creds working by amending our AWS role to permit read/write access to our S3 bucket. Woohoo!
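
For reference, the thread doesn't show the exact policy change, but a minimal IAM statement granting a cluster's instance profile role this kind of bucket access would look roughly like the sketch below (`<our_bucket>` is the placeholder from the thread):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowInitScriptBucketAccess",
      "Effect": "Allow",
      "Action": [
        "s3:GetObject",
        "s3:PutObject",
        "s3:ListBucket"
      ],
      "Resource": [
        "arn:aws:s3:::<our_bucket>",
        "arn:aws:s3:::<our_bucket>/*"
      ]
    }
  ]
}
```

Note that `s3:ListBucket` applies to the bucket ARN while `s3:GetObject`/`s3:PutObject` apply to object ARNs, which is why both resource forms appear.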


3 REPLIES

Isi
Contributor

Hey! It would be great if you could share more details about the cluster type and access mode.

If you are using, for example, an all-purpose cluster with shared access mode, I recommend configuring the "Init Script" option inside the advanced cluster settings.
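
Expressed as a Clusters API spec rather than UI clicks, the relevant pieces fit together roughly as in the fragment below; the image URL, region, and instance profile ARN are placeholders, and the S3 path is the one from the thread:

```json
{
  "docker_image": {
    "url": "<registry>/<image>:<tag>"
  },
  "init_scripts": [
    {
      "s3": {
        "destination": "s3://<our_bucket>/<our_init_script.sh>",
        "region": "<aws-region>"
      }
    }
  ],
  "aws_attributes": {
    "instance_profile_arn": "arn:aws:iam::<account-id>:instance-profile/<profile-name>"
  }
}
```

The `aws_attributes.instance_profile_arn` part matters here: a cluster-scoped init script on S3 is fetched with the instance profile's credentials, so that role needs read access to the bucket.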

If Unity Catalog is enabled, ensure that your S3 path (`s3://<our_bucket>/<our_init_script.sh>`) is allowed under Catalog Explorer > Allowed JARs/Init Scripts.
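
If you prefer managing that allowlist programmatically, Unity Catalog also exposes an artifact-allowlists REST endpoint; the request body is roughly the shape below (recalled from memory, so double-check the current API reference before relying on it):

```json
{
  "artifact_matchers": [
    {
      "artifact": "s3://<our_bucket>/",
      "match_type": "PREFIX_MATCH"
    }
  ]
}
```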

It would also be helpful to understand why you want to use the init script, as there might be other options available.

I hope you find this helpful. 🙂

mrstevegross
Contributor

>If you are using, for example, an all-purpose cluster with shared access mode, I recommend configuring the "Init Script" option inside the advanced cluster settings.

Yep, that's the approach. I've got init scripts specified in the cluster settings, and am encountering the "Missing credentials to access AWS bucket" when my job runs.

>It would also be helpful to understand why you want to use the init script, as there might be other options available.

We need to set a variety of variables that are only known when the job starts, as well as start up some processes in the container (mostly for telemetry/logging retention).

