Warehousing & Analytics

Using AWS access points

marcus1
New Contributor III

I am attempting to read data from an AWS access point by setting the Spark property as described here:

https://hadoop.apache.org/docs/stable/hadoop-aws/tools/hadoop-aws/index.html#Configuring_S3_AccessPo...

Specifically, I set this Spark property:

fs.s3a.finance-team-access.accesspoint.arn = arn:aws:s3:us-east-1:123456789012:accesspoint/finance-team-access

When I try to fetch data using dbutils.fs.ls or spark.read, it says the bucket "finance-team-access" does not exist (I'm using the example access point from the documentation). I would expect it to at least attempt to resolve the access point rather than treat the name as a regular bucket.
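
For reference, here is roughly what the attempt looks like in a notebook. This is only a sketch: the property name and ARN are the example values above, and the read path is a placeholder.

# Set the access point ARN for the S3A connector (property name as written above).
spark.sparkContext._jsc.hadoopConfiguration().set(
    "fs.s3a.finance-team-access.accesspoint.arn",
    "arn:aws:s3:us-east-1:123456789012:accesspoint/finance-team-access",
)

# Both of these fail with "bucket finance-team-access does not exist":
dbutils.fs.ls("s3a://finance-team-access/")
df = spark.read.parquet("s3a://finance-team-access/data/")  # placeholder path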

I'm using Databricks Runtime 10.4 LTS.

Thank you

ACCEPTED SOLUTION

marcus1
New Contributor III

To answer my own question, the Spark property is not required.

What is required is to use the access point alias, not the configured "name" or ARN described in the documentation linked above.

Read the access point policy documentation at https://docs.aws.amazon.com/AmazonS3/latest/userguide/access-points-policies.html very carefully, and make sure the access point policy grants the required permissions to the role defined in your instance profile.
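
For anyone hitting the same problem, here is a minimal sketch of what ended up working, assuming an access point alias of the usual "name-id-s3alias" form. The alias and path below are placeholders; copy the real alias from the access point page in the S3 console.

# Read through the access point *alias*, used exactly like a bucket name.
alias = "finance-team-access-abc123xyz-s3alias"  # placeholder alias

dbutils.fs.ls(f"s3a://{alias}/")
df = spark.read.parquet(f"s3a://{alias}/finance/reports/")  # placeholder path
display(df)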

This feature should be better documented in the Databricks on AWS documentation.


