12-28-2022 06:11 AM
Dear team,
We have several AWS accounts with S3 buckets. Our Databricks setup is in our dev AWS account, and we would like its instance profile to have read permission on all of our S3 buckets in the other AWS accounts (without using bucket policies, which would require adding one to every bucket).
I am trying to use an assume role, but it does not work; I get access denied.
It works only if I set the S3 bucket permissions in the bucket policy on the other/remote AWS account.
Please advise.
Thanks!
Accepted Solutions
12-28-2022 11:09 AM
Can you please share the IAM role policy in the secondary account (the bucket-owner account)? Also, have you tried setting the following configuration on the cluster?
spark.hadoop.fs.s3a.bucket.<s3-bucket-name>.aws.credentials.provider org.apache.hadoop.fs.s3a.auth.AssumedRoleCredentialProvider
spark.hadoop.fs.s3a.bucket.<s3-bucket-name>.assumed.role.arn arn:aws:iam::<bucket-owner-account-id>:role/Master_Role
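For reference, a minimal sketch of the same two per-bucket properties applied from a notebook session instead of the cluster Spark config. The bucket name, account id, and role name below are placeholders, and this assumes the role in the bucket-owner account grants S3 read access and trusts the Databricks instance profile so it can be assumed:

# Sketch only: set the s3a per-bucket overrides on the Hadoop configuration
# before the bucket is accessed for the first time in this session.
hconf = spark.sparkContext._jsc.hadoopConfiguration()
hconf.set(
    "fs.s3a.bucket.my-remote-bucket.aws.credentials.provider",
    "org.apache.hadoop.fs.s3a.auth.AssumedRoleCredentialProvider",
)
hconf.set(
    "fs.s3a.bucket.my-remote-bucket.assumed.role.arn",
    "arn:aws:iam::123456789012:role/Master_Role",  # placeholder bucket-owner account id
)

# Quick check that the cross-account bucket is readable through the assumed role.
spark.read.text("s3a://my-remote-bucket/some/path/").show(5)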
12-29-2022 12:59 AM
Thank you @D Raj Kumar
I added it and it works now!
Thanks

