09-27-2022 11:05 AM
I am attempting to read data from an AWS S3 access point by setting the Spark property as described here:
Modifying the Spark properties
fs.s3a.finance-team-access.accesspoint.arn
arn:aws:s3:us-east-1:123456789012:accesspoint/finance-team-access
When I try to fetch data using dbutils.fs.ls or spark.read, it says the bucket "finance-team-access" does not exist (I'm using the example access point from the documentation). I would have expected the read to go through the access point.
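For clarity, a minimal sketch of the failing approach (the property name and ARN are taken verbatim from the post above; the S3 prefix is a made-up placeholder, and the cluster-only calls are commented out):

```python
# Property name and ARN as quoted in the question (example values from the AWS docs).
ap_name = "finance-team-access"
ap_arn = f"arn:aws:s3:us-east-1:123456789012:accesspoint/{ap_name}"

# On a Databricks cluster this would be set via the Spark config, e.g.:
# spark.conf.set(f"fs.s3a.{ap_name}.accesspoint.arn", ap_arn)

# Reading via the access point *name* then fails with "bucket does not exist":
path = f"s3a://{ap_name}/example/prefix/"  # hypothetical prefix
# dbutils.fs.ls(path)            # or: spark.read.parquet(path)
```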
I'm using databricks runtime 10.4 LTS.
Thank you
Accepted Solutions
09-28-2022 01:41 PM
To answer my own question: the Spark property is not required.
What is required is to use the access point alias, not the configured "name" or ARN shown in the Spark documentation.
Read the access point documentation (https://docs.aws.amazon.com/AmazonS3/latest/userguide/access-points-policies.html) very carefully, and make sure to grant the necessary permissions both in the access point policy and on the role defined in your instance profile.
This feature should be better documented in the AWS Databricks documentation.
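To make the accepted answer concrete, here is a minimal sketch of the working approach. The alias below is a made-up placeholder (AWS auto-generates one per access point; look up your real alias in the S3 console or with `aws s3control get-access-point`):

```python
# The access point ALIAS (not the name or ARN) is what S3A treats as the
# bucket name. This alias is hypothetical; real aliases end in "-s3alias".
alias = "finance-team-access-abc123xyz456example-s3alias"

# Read through the alias exactly as if it were a bucket name:
path = f"s3a://{alias}/example/prefix/"  # hypothetical prefix
# df = spark.read.parquet(path)   # or: dbutils.fs.ls(path)
```

No extra `fs.s3a.*.accesspoint.arn` property is needed with this approach; the alias resolves to the access point on the AWS side, provided the access point policy and the instance profile role both grant access.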

