Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

mount bucket s3

thiagoawstest
Contributor

Hi, I have Databricks configured on AWS and need to mount some S3 buckets under /mnt in Databricks, but I have a few questions:

- How can a bucket be mounted so that all clusters and users have access to it, without having to remount it every time a cluster starts?

- Is there a way to mount it without creating an access key in AWS? What is the best practice for a production environment?

Thanks.

1 ACCEPTED SOLUTION

Yeshwanth
Databricks Employee

@thiagoawstest To mount an S3 bucket in Databricks on AWS so that all clusters and users have access to it without needing to remount each time, and without creating an access key in AWS, follow these steps: 

Mounting an S3 Bucket Using an AWS Instance Profile

1. Configure your cluster with an instance profile:
   - Ensure the role behind your AWS instance profile has the necessary permissions on the S3 bucket (a policy sketch follows the Python example below).
   - Attach the instance profile to your Databricks cluster.

2. Mount the S3 bucket:
   - Use the dbutils.fs.mount command to mount the S3 bucket. The mount point will be accessible to all users and clusters, and it will persist across cluster restarts.

Example in Python:

aws_bucket_name = "<aws-bucket-name>"  # name of the S3 bucket to mount
mount_name = "<mount-name>"            # folder name to appear under /mnt

# With an instance profile attached to the cluster, no access keys are
# needed: the s3a connector picks up credentials from the instance role.
dbutils.fs.mount(
    source=f"s3a://{aws_bucket_name}",
    mount_point=f"/mnt/{mount_name}"
)

# Verify the mount by listing its contents
display(dbutils.fs.ls(f"/mnt/{mount_name}"))
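
For the permissions mentioned in step 1, here is a minimal sketch of an inline IAM policy attached to the instance profile's role via boto3. This is illustrative, not the only way to do it: role_name, aws_bucket_name, and the policy name "databricks-s3-mount" are placeholders, the sketch assumes the role already exists and is attached to an instance profile registered in Databricks, and you should narrow the actions to what your workload actually needs.

import json
import boto3

# Placeholders; substitute your own role and bucket names
role_name = "<instance-profile-role>"
aws_bucket_name = "<aws-bucket-name>"

# Minimal S3 permissions: list the bucket, read/write its objects
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:ListBucket"],
            "Resource": [f"arn:aws:s3:::{aws_bucket_name}"],
        },
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject", "s3:DeleteObject"],
            "Resource": [f"arn:aws:s3:::{aws_bucket_name}/*"],
        },
    ],
}

iam = boto3.client("iam")
iam.put_role_policy(
    RoleName=role_name,
    PolicyName="databricks-s3-mount",  # hypothetical policy name
    PolicyDocument=json.dumps(policy),
)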

Reference: https://docs.databricks.com/en/dbfs/mounts.html#mount-a-bucket-using-an-aws-instance-profile
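
Not part of the accepted answer, but a couple of standard dbutils.fs helpers are handy once the mount exists: listing current mounts, refreshing the mount table on clusters that were already running when the mount was created, and unmounting when you need to reconfigure. A short sketch, reusing mount_name from above:

# List current mounts to confirm the bucket is attached
for m in dbutils.fs.mounts():
    print(m.mountPoint, "->", m.source)

# Clusters already running when the mount was created need a
# refresh of their mount cache to see it
dbutils.fs.refreshMounts()

# Remove the mount if you need to reconfigure it
dbutils.fs.unmount(f"/mnt/{mount_name}")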


