@thiagoawstest To mount an S3 bucket in Databricks on AWS so that all clusters and users have access to it without needing to remount each time, and without creating an access key in AWS, follow these steps:
Mounting an S3 Bucket Using an AWS Instance Profile
1. Configure your cluster with an instance profile:
- Ensure the IAM role behind your instance profile can access the S3 bucket, typically `s3:ListBucket` on the bucket and `s3:GetObject`, `s3:PutObject`, and `s3:DeleteObject` on its objects (a policy sketch follows the mount example below).
- Attach the instance profile to your Databricks cluster.
2. Mount the S3 bucket:
- Use the `dbutils.fs.mount` command to mount the S3 bucket. The mount point will be accessible to all users and clusters in the workspace, and it will persist across cluster restarts.
Example in Python:
```python
aws_bucket_name = "<aws-bucket-name>"
mount_name = "<mount-name>"

# The cluster's instance profile supplies the credentials, so no access key is needed
dbutils.fs.mount(
    source=f"s3a://{aws_bucket_name}",
    mount_point=f"/mnt/{mount_name}"
)

# Verify the mount
display(dbutils.fs.ls(f"/mnt/{mount_name}"))
```
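For step 1, the IAM role behind the instance profile needs list access on the bucket and read/write access on its objects. Below is a minimal sketch of such a policy attached inline with boto3; the role name, policy name, and bucket name are placeholders, and your setup may require additional actions (for example, `s3:PutObjectAcl`):

```python
import json

import boto3

# Placeholder names; substitute your own role and bucket
role_name = "<instance-profile-role>"
bucket = "<aws-bucket-name>"

# Minimal S3 access: list the bucket, and read/write/delete its objects
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:ListBucket"],
            "Resource": [f"arn:aws:s3:::{bucket}"],
        },
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject", "s3:DeleteObject"],
            "Resource": [f"arn:aws:s3:::{bucket}/*"],
        },
    ],
}

# Attach the policy inline to the role used by the instance profile
iam = boto3.client("iam")
iam.put_role_policy(
    RoleName=role_name,
    PolicyName="databricks-s3-mount-access",
    PolicyDocument=json.dumps(policy),
)
```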
Reference: https://docs.databricks.com/en/dbfs/mounts.html#mount-a-bucket-using-an-aws-instance-profile
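One housekeeping note: if you later need to change the mount (for example, after updating the role's permissions), remove it with `dbutils.fs.unmount` and recreate it; other running clusters can pick up the change with `dbutils.fs.refreshMounts()`. Both are standard dbutils calls:

```python
# Remove the existing mount before remounting with new settings
dbutils.fs.unmount(f"/mnt/{mount_name}")

# On other running clusters, refresh the mount cache so they see the change
dbutils.fs.refreshMounts()
```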