Hi,
We're currently setting up Databricks Unity Catalog on AWS. We created an S3 bucket and an IAM role (databricks-storage-role) to give Databricks access to it.
Note: Databricks doesn't use the IAM role directly; instead, Unity Catalog requires a Storage Credential that is explicitly linked to this IAM role.
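For reference, this is roughly how we created the Storage Credential. As far as we can tell, the Unity Catalog storage-credentials API (and the equivalent UI form) takes a payload along these lines; the credential name is our own choice and <MY_ACCOUNT> is a placeholder:

```
{
  "name": "databricks-storage-credential",
  "aws_iam_role": {
    "role_arn": "arn:aws:iam::<MY_ACCOUNT>:role/databricks-storage-role"
  },
  "comment": "Credential for Unity Catalog storage"
}
```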
❗️ Issue
While creating a Databricks workspace via the UI, we are prompted for the Unity Catalog configuration.
However, when we select the already configured Storage Credential, we receive the following error (screenshot attached):
PERMISSION_DENIED: AWS IAM role does not have READ permissions on url s3://databricks-workspace-storage-eu-west-2/unity-catalog/***************.
Cause: 403 Forbidden error from cloud storage provider.
✅ What We've Done
1. S3 Bucket Created: databricks-workspace-storage-eu-west-2
2. IAM Role (databricks-storage-role):
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::<DATABRICKS_ACCOUNT>:root"
      },
      "Action": "sts:AssumeRole",
      "Condition": {
        "StringEquals": {
          "aws:PrincipalTag/DatabricksAccountId": "<DATABRICKS_ACCOUNT_ID>"
        }
      }
    }
  ]
}
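In case our trust policy is the culprit: as far as we can tell from the Databricks documentation, the trust relationship for Unity Catalog storage credentials uses the Databricks-managed AWS account (414351767826) as the principal and an sts:ExternalId condition set to the Databricks account ID, rather than a principal tag. Roughly:

```
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::414351767826:root"
      },
      "Action": "sts:AssumeRole",
      "Condition": {
        "StringEquals": {
          "sts:ExternalId": "<DATABRICKS_ACCOUNT_ID>"
        }
      }
    }
  ]
}
```

We also understand the role may need to trust its own ARN (a "self-assuming" role), which the docs mention as a separate requirement; we haven't confirmed whether that applies here.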
3. IAM Policy (databricks-storage-policy):
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket", "s3:GetBucketLocation"],
      "Resource": "arn:aws:s3:::databricks-workspace-storage-eu-west-2"
    },
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:GetObjectVersion", "s3:DeleteObject"],
      "Resource": [
        "arn:aws:s3:::databricks-workspace-storage-eu-west-2/*",
        "arn:aws:s3:::databricks-workspace-storage-eu-west-2/unity-catalog/*"
      ]
    },
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:PutObjectAcl"],
      "Resource": [
        "arn:aws:s3:::databricks-workspace-storage-eu-west-2/*",
        "arn:aws:s3:::databricks-workspace-storage-eu-west-2/unity-catalog/*"
      ],
      "Condition": {
        "StringEquals": {
          "s3:x-amz-acl": "bucket-owner-full-control"
        }
      }
    }
  ]
}
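To rule out a typo in the Action/Resource matching (without calling AWS), here is a quick local sketch that replays the read-related Allow statements above against the path from the error. This is a toy matcher, not the real IAM evaluator (it ignores Deny statements and Condition blocks), and the object ID in the path is a made-up placeholder since the real one is redacted in the error:

```python
# Toy check: do the Allow statements grant a given S3 action on a resource ARN?
# Simplified: ignores Deny, Condition blocks, NotAction, and policy boundaries.
from fnmatch import fnmatch

POLICY = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:ListBucket", "s3:GetBucketLocation"],
            "Resource": "arn:aws:s3:::databricks-workspace-storage-eu-west-2",
        },
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:GetObjectVersion", "s3:DeleteObject"],
            "Resource": [
                "arn:aws:s3:::databricks-workspace-storage-eu-west-2/*",
                "arn:aws:s3:::databricks-workspace-storage-eu-west-2/unity-catalog/*",
            ],
        },
    ],
}


def allows(policy, action, resource):
    """Return True if any Allow statement's wildcards match both action and resource."""
    for stmt in policy["Statement"]:
        if stmt["Effect"] != "Allow":
            continue
        actions = stmt["Action"] if isinstance(stmt["Action"], list) else [stmt["Action"]]
        resources = stmt["Resource"] if isinstance(stmt["Resource"], list) else [stmt["Resource"]]
        if any(fnmatch(action, a) for a in actions) and any(
            fnmatch(resource, r) for r in resources
        ):
            return True
    return False


# Placeholder object path (the real unity-catalog ID is redacted in the error).
obj = "arn:aws:s3:::databricks-workspace-storage-eu-west-2/unity-catalog/placeholder-id/x"
print(allows(POLICY, "s3:GetObject", obj))  # True
print(allows(POLICY, "s3:ListBucket",
             "arn:aws:s3:::databricks-workspace-storage-eu-west-2"))  # True
```

Both checks pass locally, which suggests the Allow statements themselves are shaped correctly and the 403 comes from somewhere else (our suspicion is the AssumeRole step rather than the S3 permissions).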
4. Bucket Policy (databricks-workspace-storage-eu-west-2):
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "GrantDatabricksRootAccess",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::<DATABRICKS_ACCOUNT>:root"
      },
      "Action": [
        "s3:GetObject", "s3:GetObjectVersion", "s3:PutObject",
        "s3:DeleteObject", "s3:ListBucket", "s3:GetBucketLocation"
      ],
      "Resource": [
        "arn:aws:s3:::databricks-workspace-storage-eu-west-2",
        "arn:aws:s3:::databricks-workspace-storage-eu-west-2/*"
      ]
    },
    {
      "Sid": "AllowUnityCatalogAccessFromRole",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::<MY_ACCOUNT>:role/databricks-storage-role"
      },
      "Action": [
        "s3:GetObject", "s3:PutObject", "s3:DeleteObject",
        "s3:ListBucket", "s3:GetBucketLocation"
      ],
      "Resource": [
        "arn:aws:s3:::databricks-workspace-storage-eu-west-2",
        "arn:aws:s3:::databricks-workspace-storage-eu-west-2/*"
      ]
    }
  ]
}
❓ Questions
What is the correct way to allow Databricks access to an S3 bucket via Storage Credential?
What could we be missing, even though the policies appear to be fully configured?
Should we pre-create the specific unity-catalog/<ID> prefix in the bucket ourselves?
Any advice is appreciated!