<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Amazon returns a 403 error code when trying to access an S3 Bucket in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/amazon-returns-a-403-error-code-when-trying-to-access-an-s3/m-p/40524#M27214</link>
    <description>&lt;P&gt;I had the same issue, and I found a solution.&lt;/P&gt;&lt;P&gt;For me, the permission problems only exist when the cluster's (compute's) Access mode is "Shared No Isolation".&amp;nbsp; When the Access mode is either "Shared" or "Single User", the IAM configuration applies as expected.&amp;nbsp; When set to "Shared No Isolation", it's as if the IAM settings are not being applied, and a series of 403 errors is thrown.&lt;BR /&gt;&lt;BR /&gt;Also, and this is interesting, the "Instance Profile" setting can be either "None" or the ARN from step 6 described in the link below; it makes no difference.&lt;BR /&gt;&lt;BR /&gt;&lt;A href="https://docs.databricks.com/en/aws/iam/instance-profile-tutorial.html" target="_blank" rel="noopener nofollow noreferrer"&gt;https://docs.databricks.com/en/aws/iam/instance-profile-tutorial.html&lt;/A&gt;&lt;/P&gt;</description>
    <pubDate>Sat, 19 Aug 2023 00:51:04 GMT</pubDate>
    <dc:creator>winojoe</dc:creator>
    <dc:date>2023-08-19T00:51:04Z</dc:date>
    <item>
      <title>Amazon returns a 403 error code when trying to access an S3 Bucket</title>
      <link>https://community.databricks.com/t5/data-engineering/amazon-returns-a-403-error-code-when-trying-to-access-an-s3/m-p/11454#M6425</link>
      <description>&lt;P&gt;Hey! So far I have followed along with the &lt;A href="https://docs.databricks.com/aws/iam/instance-profile-tutorial.html" alt="https://docs.databricks.com/aws/iam/instance-profile-tutorial.html" target="_blank"&gt;&lt;B&gt;Configure S3 access with instance profiles&lt;/B&gt;&lt;/A&gt; article to grant my cluster access to an S3 bucket. I have also made sure to disable IAM role passthrough on the cluster.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Upon querying the bucket through a notebook using:&lt;/P&gt;&lt;PRE&gt;&lt;CODE&gt;dbutils.fs.ls("s3://&amp;lt;bucket-name&amp;gt;/")&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;I receive a 403: Access Denied message back from Amazon. I've double-checked that AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY are not present in the Spark environment variables. I've also gone through this article - &lt;A href="https://kb.databricks.com/en_US/security/forbidden-access-to-s3-data" target="_blank"&gt;https://kb.databricks.com/en_US/security/forbidden-access-to-s3-data&lt;/A&gt; - and have made sure to follow all the best practices.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Does anyone have any recommendations on what I should check or test?&lt;/P&gt;</description>
      <pubDate>Tue, 17 Jan 2023 22:15:33 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/amazon-returns-a-403-error-code-when-trying-to-access-an-s3/m-p/11454#M6425</guid>
      <dc:creator>danatsafe</dc:creator>
      <dc:date>2023-01-17T22:15:33Z</dc:date>
    </item>
    <item>
      <title>Re: Amazon returns a 403 error code when trying to access an S3 Bucket</title>
      <link>https://community.databricks.com/t5/data-engineering/amazon-returns-a-403-error-code-when-trying-to-access-an-s3/m-p/11455#M6426</link>
      <description>&lt;P&gt;Can you check the workspace VPC's route tables?&lt;/P&gt;&lt;P&gt;If you're using an S3 gateway endpoint, check whether the gateway endpoint's prefix list is explicitly added to the route tables of the workspace VPC subnets. If traffic goes out via a traditional NAT/Internet Gateway, double- and triple-check the route tables' gateway entries. If you're using S3 interface endpoints, check that they are properly associated with the workspace VPC.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;All in all, cross-check the networking of the workspace VPC (subnets, security groups, NACLs, and firewall, if any). If that looks fine, also check whether the user's access hits any explicit Deny via endpoint policies, IAM, or bucket policies.&lt;/P&gt;</description>
      <pubDate>Fri, 24 Feb 2023 23:42:07 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/amazon-returns-a-403-error-code-when-trying-to-access-an-s3/m-p/11455#M6426</guid>
      <dc:creator>User15848365773</dc:creator>
      <dc:date>2023-02-24T23:42:07Z</dc:date>
    </item>
    <item>
      <title>Re: Amazon returns a 403 error code when trying to access an S3 Bucket</title>
      <link>https://community.databricks.com/t5/data-engineering/amazon-returns-a-403-error-code-when-trying-to-access-an-s3/m-p/40387#M27191</link>
      <description>&lt;P&gt;Hi - having the same issue.&amp;nbsp; Just wondering if you were able to resolve it? If so, how?&lt;/P&gt;</description>
      <pubDate>Fri, 18 Aug 2023 05:21:25 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/amazon-returns-a-403-error-code-when-trying-to-access-an-s3/m-p/40387#M27191</guid>
      <dc:creator>winojoe</dc:creator>
      <dc:date>2023-08-18T05:21:25Z</dc:date>
    </item>
    <item>
      <title>Re: Amazon returns a 403 error code when trying to access an S3 Bucket</title>
      <link>https://community.databricks.com/t5/data-engineering/amazon-returns-a-403-error-code-when-trying-to-access-an-s3/m-p/40524#M27214</link>
      <description>&lt;P&gt;I had the same issue, and I found a solution.&lt;/P&gt;&lt;P&gt;For me, the permission problems only exist when the cluster's (compute's) Access mode is "Shared No Isolation".&amp;nbsp; When the Access mode is either "Shared" or "Single User", the IAM configuration applies as expected.&amp;nbsp; When set to "Shared No Isolation", it's as if the IAM settings are not being applied, and a series of 403 errors is thrown.&lt;BR /&gt;&lt;BR /&gt;Also, and this is interesting, the "Instance Profile" setting can be either "None" or the ARN from step 6 described in the link below; it makes no difference.&lt;BR /&gt;&lt;BR /&gt;&lt;A href="https://docs.databricks.com/en/aws/iam/instance-profile-tutorial.html" target="_blank" rel="noopener nofollow noreferrer"&gt;https://docs.databricks.com/en/aws/iam/instance-profile-tutorial.html&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Sat, 19 Aug 2023 00:51:04 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/amazon-returns-a-403-error-code-when-trying-to-access-an-s3/m-p/40524#M27214</guid>
      <dc:creator>winojoe</dc:creator>
      <dc:date>2023-08-19T00:51:04Z</dc:date>
    </item>
  </channel>
</rss>

