<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Unable to Read Data from S3 in Databricks (AWS Free Trial) in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/unable-to-read-data-from-s3-in-databricks-aws-free-trial/m-p/107636#M42875</link>
    <description>&lt;P&gt;&lt;FONT face="arial,helvetica,sans-serif"&gt;Hey Community, &lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&lt;FONT face="arial,helvetica,sans-serif"&gt;I recently signed up for a Databricks free trial on AWS and created a workspace using the quickstart method. After setting up my cluster and opening a notebook, I tried to read a Parquet file from S3 using:&lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;LI-CODE lang="python"&gt;spark.read.parquet("s3://&amp;lt;bucket-name&amp;gt;/path/")&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;FONT face="arial,helvetica,sans-serif"&gt;However, I’m getting the following error:&lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;LI-CODE lang="python"&gt;Py4JJavaError: An error occurred while calling o408.parquet.
: java.nio.file.AccessDeniedException: s3://databricks-workspace-stack-d3546-bucket/parquet-samples: shaded.databricks.org.apache.hadoop.fs.s3a.auth.NoAuthWithAWSException: No AWS Credentials provided by AwsCredentialContextTokenProvider : com.amazonaws.SdkClientException: Unable to load AWS credentials from any provider in the chain: [com.databricks.backend.daemon.driver.aws.AwsLocalCredentialContextTokenProvider@fd37933: No role specified and no roles available., com.databricks.backend.daemon.driver.aws.ProxiedIAMCredentialProvider@6d3127a1: User does not have any IAM roles]
	at shaded.databricks.org.apache.hadoop.fs.s3a.S3AUtils.translateException(S3AUtils.java:249)
	at shaded.databricks.org.apache.hadoop.fs.s3a.S3AUtils.translateException(S3AUtils.java:197)
	at shaded.databricks.org.apache.hadoop.fs.s3a.S3AFileSystem.s3GetFileStatus(S3AFileSystem.java:4141)
	at shaded.databricks.org.apache.hadoop.fs.s3a.S3AFileSystem.innerGetFileStatus(S3AFileSystem.java:4067)
	at shaded.databricks.org.apache.hadoop.fs.s3a.S3AFileSystem.getFileStatus(S3AFileSystem.java:3947)
	at com.databricks.common.filesystem.LokiS3FS.getFileStatusNoCache(LokiS3FS.scala:84)
	at com.databricks.common.filesystem.LokiS3FS.getFileStatus(LokiS3FS.scala:74)
	at com.databricks.common.filesystem.LokiFileSystem.getFileStatus(LokiFileSystem.scala:272)
	at org.apache.hadoop.fs.FileSystem.isDirectory(FileSystem.java:1880)
	at org.apache.spark.sql.execution.streaming.FileStreamSink$.hasMetadata(FileStreamSink.scala:60)
	at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:416)
	at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:389)
	at org.apache.spark.sql.DataFrameReader.$anonfun$load$2(DataFrameReader.scala:345)
	at scala.Option.getOrElse(Option.scala:189)
	at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:345)
	at org.apache.spark.sql.DataFrameReader.parquet(DataFrameReader.scala:866)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:566)
	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:397)
	at py4j.Gateway.invoke(Gateway.java:306)
	at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
	at py4j.commands.CallCommand.execute(CallCommand.java:79)
	at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:199)
	at py4j.ClientServerConnection.run(ClientServerConnection.java:119)
	at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: shaded.databricks.org.apache.hadoop.fs.s3a.auth.NoAuthWithAWSException: No AWS Credentials provided by AwsCredentialContextTokenProvider : com.amazonaws.SdkClientException: Unable to load AWS credentials from any provider in the chain: [com.databricks.backend.daemon.driver.aws.AwsLocalCredentialContextTokenProvider@fd37933: No role specified and no roles available., com.databricks.backend.daemon.driver.aws.ProxiedIAMCredentialProvider@6d3127a1: User does not have any IAM roles]
	at shaded.databricks.org.apache.hadoop.fs.s3a.AWSCredentialProviderList.getCredentials(AWSCredentialProviderList.java:239)
	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.getCredentialsFromContext(AmazonHttpClient.java:1269)
	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.runBeforeRequestHandlers(AmazonHttpClient.java:845)
	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.doExecute(AmazonHttpClient.java:794)
	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeWithTimer(AmazonHttpClient.java:781)
	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.execute(AmazonHttpClient.java:755)
	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.access$500(AmazonHttpClient.java:715)
	at com.amazonaws.http.AmazonHttpClient$RequestExecutionBuilderImpl.execute(AmazonHttpClient.java:697)
	at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:561)
	at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:541)
	at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:5456)
	at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:5403)
	at com.amazonaws.services.s3.AmazonS3Client.getObjectMetadata(AmazonS3Client.java:1372)
	at shaded.databricks.org.apache.hadoop.fs.s3a.EnforcingDatabricksS3Client.getObjectMetadata(EnforcingDatabricksS3Client.scala:222)
	at shaded.databricks.org.apache.hadoop.fs.s3a.S3AFileSystem.lambda$getObjectMetadata$6(S3AFileSystem.java:2364)
	at shaded.databricks.org.apache.hadoop.fs.s3a.Invoker.retryUntranslated(Invoker.java:435)
	at shaded.databricks.org.apache.hadoop.fs.s3a.Invoker.retryUntranslated(Invoker.java:394)
	at shaded.databricks.org.apache.hadoop.fs.s3a.S3AFileSystem.getObjectMetadata(S3AFileSystem.java:2354)
	at shaded.databricks.org.apache.hadoop.fs.s3a.S3AFileSystem.getObjectMetadata(S3AFileSystem.java:2322)
	at shaded.databricks.org.apache.hadoop.fs.s3a.S3AFileSystem.s3GetFileStatus(S3AFileSystem.java:4122)
	... 25 more
Caused by: com.amazonaws.SdkClientException: Unable to load AWS credentials from any provider in the chain: [com.databricks.backend.daemon.driver.aws.AwsLocalCredentialContextTokenProvider@fd37933: No role specified and no roles available., com.databricks.backend.daemon.driver.aws.ProxiedIAMCredentialProvider@6d3127a1: User does not have any IAM roles]
	at com.amazonaws.auth.AWSCredentialsProviderChain.getCredentials(AWSCredentialsProviderChain.java:136)
	at com.databricks.backend.daemon.driver.aws.AwsCredentialContextTokenProvider.getCredentials(AwsCredentialContextTokenProvider.scala:84)
	at shaded.databricks.org.apache.hadoop.fs.s3a.AWSCredentialProviderList.getCredentials(AWSCredentialProviderList.java:194)
	... 44 more
File &amp;lt;command-7886099615157681&amp;gt;&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;FONT face="arial,helvetica,sans-serif"&gt;I’ve already checked my AWS IAM role, and it has all necessary S3 permissions. Could someone help me troubleshoot this issue?&lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&lt;FONT face="arial,helvetica,sans-serif"&gt;Thanks in advance!&lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
    <pubDate>Wed, 29 Jan 2025 16:42:47 GMT</pubDate>
    <dc:creator>messiah</dc:creator>
    <dc:date>2025-01-29T16:42:47Z</dc:date>
    <item>
      <title>Unable to Read Data from S3 in Databricks (AWS Free Trial)</title>
      <link>https://community.databricks.com/t5/data-engineering/unable-to-read-data-from-s3-in-databricks-aws-free-trial/m-p/107636#M42875</link>
      <description>&lt;P&gt;&lt;FONT face="arial,helvetica,sans-serif"&gt;Hey Community, &lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&lt;FONT face="arial,helvetica,sans-serif"&gt;I recently signed up for a Databricks free trial on AWS and created a workspace using the quickstart method. After setting up my cluster and opening a notebook, I tried to read a Parquet file from S3 using:&lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;LI-CODE lang="python"&gt;spark.read.parquet("s3://&amp;lt;bucket-name&amp;gt;/path/")&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;FONT face="arial,helvetica,sans-serif"&gt;However, I’m getting the following error:&lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;LI-CODE lang="python"&gt;Py4JJavaError: An error occurred while calling o408.parquet.
: java.nio.file.AccessDeniedException: s3://databricks-workspace-stack-d3546-bucket/parquet-samples: shaded.databricks.org.apache.hadoop.fs.s3a.auth.NoAuthWithAWSException: No AWS Credentials provided by AwsCredentialContextTokenProvider : com.amazonaws.SdkClientException: Unable to load AWS credentials from any provider in the chain: [com.databricks.backend.daemon.driver.aws.AwsLocalCredentialContextTokenProvider@fd37933: No role specified and no roles available., com.databricks.backend.daemon.driver.aws.ProxiedIAMCredentialProvider@6d3127a1: User does not have any IAM roles]
	at shaded.databricks.org.apache.hadoop.fs.s3a.S3AUtils.translateException(S3AUtils.java:249)
	at shaded.databricks.org.apache.hadoop.fs.s3a.S3AUtils.translateException(S3AUtils.java:197)
	at shaded.databricks.org.apache.hadoop.fs.s3a.S3AFileSystem.s3GetFileStatus(S3AFileSystem.java:4141)
	at shaded.databricks.org.apache.hadoop.fs.s3a.S3AFileSystem.innerGetFileStatus(S3AFileSystem.java:4067)
	at shaded.databricks.org.apache.hadoop.fs.s3a.S3AFileSystem.getFileStatus(S3AFileSystem.java:3947)
	at com.databricks.common.filesystem.LokiS3FS.getFileStatusNoCache(LokiS3FS.scala:84)
	at com.databricks.common.filesystem.LokiS3FS.getFileStatus(LokiS3FS.scala:74)
	at com.databricks.common.filesystem.LokiFileSystem.getFileStatus(LokiFileSystem.scala:272)
	at org.apache.hadoop.fs.FileSystem.isDirectory(FileSystem.java:1880)
	at org.apache.spark.sql.execution.streaming.FileStreamSink$.hasMetadata(FileStreamSink.scala:60)
	at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:416)
	at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:389)
	at org.apache.spark.sql.DataFrameReader.$anonfun$load$2(DataFrameReader.scala:345)
	at scala.Option.getOrElse(Option.scala:189)
	at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:345)
	at org.apache.spark.sql.DataFrameReader.parquet(DataFrameReader.scala:866)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:566)
	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:397)
	at py4j.Gateway.invoke(Gateway.java:306)
	at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
	at py4j.commands.CallCommand.execute(CallCommand.java:79)
	at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:199)
	at py4j.ClientServerConnection.run(ClientServerConnection.java:119)
	at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: shaded.databricks.org.apache.hadoop.fs.s3a.auth.NoAuthWithAWSException: No AWS Credentials provided by AwsCredentialContextTokenProvider : com.amazonaws.SdkClientException: Unable to load AWS credentials from any provider in the chain: [com.databricks.backend.daemon.driver.aws.AwsLocalCredentialContextTokenProvider@fd37933: No role specified and no roles available., com.databricks.backend.daemon.driver.aws.ProxiedIAMCredentialProvider@6d3127a1: User does not have any IAM roles]
	at shaded.databricks.org.apache.hadoop.fs.s3a.AWSCredentialProviderList.getCredentials(AWSCredentialProviderList.java:239)
	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.getCredentialsFromContext(AmazonHttpClient.java:1269)
	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.runBeforeRequestHandlers(AmazonHttpClient.java:845)
	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.doExecute(AmazonHttpClient.java:794)
	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeWithTimer(AmazonHttpClient.java:781)
	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.execute(AmazonHttpClient.java:755)
	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.access$500(AmazonHttpClient.java:715)
	at com.amazonaws.http.AmazonHttpClient$RequestExecutionBuilderImpl.execute(AmazonHttpClient.java:697)
	at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:561)
	at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:541)
	at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:5456)
	at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:5403)
	at com.amazonaws.services.s3.AmazonS3Client.getObjectMetadata(AmazonS3Client.java:1372)
	at shaded.databricks.org.apache.hadoop.fs.s3a.EnforcingDatabricksS3Client.getObjectMetadata(EnforcingDatabricksS3Client.scala:222)
	at shaded.databricks.org.apache.hadoop.fs.s3a.S3AFileSystem.lambda$getObjectMetadata$6(S3AFileSystem.java:2364)
	at shaded.databricks.org.apache.hadoop.fs.s3a.Invoker.retryUntranslated(Invoker.java:435)
	at shaded.databricks.org.apache.hadoop.fs.s3a.Invoker.retryUntranslated(Invoker.java:394)
	at shaded.databricks.org.apache.hadoop.fs.s3a.S3AFileSystem.getObjectMetadata(S3AFileSystem.java:2354)
	at shaded.databricks.org.apache.hadoop.fs.s3a.S3AFileSystem.getObjectMetadata(S3AFileSystem.java:2322)
	at shaded.databricks.org.apache.hadoop.fs.s3a.S3AFileSystem.s3GetFileStatus(S3AFileSystem.java:4122)
	... 25 more
Caused by: com.amazonaws.SdkClientException: Unable to load AWS credentials from any provider in the chain: [com.databricks.backend.daemon.driver.aws.AwsLocalCredentialContextTokenProvider@fd37933: No role specified and no roles available., com.databricks.backend.daemon.driver.aws.ProxiedIAMCredentialProvider@6d3127a1: User does not have any IAM roles]
	at com.amazonaws.auth.AWSCredentialsProviderChain.getCredentials(AWSCredentialsProviderChain.java:136)
	at com.databricks.backend.daemon.driver.aws.AwsCredentialContextTokenProvider.getCredentials(AwsCredentialContextTokenProvider.scala:84)
	at shaded.databricks.org.apache.hadoop.fs.s3a.AWSCredentialProviderList.getCredentials(AWSCredentialProviderList.java:194)
	... 44 more
File &amp;lt;command-7886099615157681&amp;gt;&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;FONT face="arial,helvetica,sans-serif"&gt;I’ve already checked my AWS IAM role, and it has all necessary S3 permissions. Could someone help me troubleshoot this issue?&lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&lt;FONT face="arial,helvetica,sans-serif"&gt;Thanks in advance!&lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Wed, 29 Jan 2025 16:42:47 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/unable-to-read-data-from-s3-in-databricks-aws-free-trial/m-p/107636#M42875</guid>
      <dc:creator>messiah</dc:creator>
      <dc:date>2025-01-29T16:42:47Z</dc:date>
    </item>
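The decisive line in the trace above is the final SdkClientException: each credential provider in the chain was tried in order and declined, so the exception aggregates every provider's reason ("No role specified and no roles available." from the cluster side, "User does not have any IAM roles" from the user side). A toy Python sketch of that chain-of-responsibility lookup, with all class and function names purely illustrative (not real Databricks or AWS SDK APIs):

```python
# Toy model of an AWS-style credential provider chain, to illustrate why the
# stack trace lists every provider that declined. All names here are
# illustrative stubs, not real Databricks or AWS SDK classes.
class NoCredentialsError(Exception):
    pass

class CredentialProviderChain:
    def __init__(self, providers):
        self.providers = providers

    def get_credentials(self):
        reasons = []
        for provider in self.providers:
            try:
                # First provider that succeeds wins.
                return provider()
            except NoCredentialsError as exc:
                reasons.append(f"{provider.__name__}: {exc}")
        # Mirrors "Unable to load AWS credentials from any provider in the chain: [...]"
        raise NoCredentialsError(
            "Unable to load credentials from any provider in the chain: " + str(reasons)
        )

# The two providers named in the error, reduced to stubs that always decline.
def instance_profile_provider():
    raise NoCredentialsError("No role specified and no roles available.")

def passthrough_provider():
    raise NoCredentialsError("User does not have any IAM roles")

chain = CredentialProviderChain([instance_profile_provider, passthrough_provider])
```

In the real trace, the two declining providers correspond to a cluster with no instance profile attached and a user with no IAM role mapped, which is why the replies below focus on instance profiles and Unity Catalog external locations rather than on S3 bucket permissions alone.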
    <item>
      <title>Re: Unable to Read Data from S3 in Databricks (AWS Free Trial)</title>
      <link>https://community.databricks.com/t5/data-engineering/unable-to-read-data-from-s3-in-databricks-aws-free-trial/m-p/107666#M42882</link>
      <description>&lt;P&gt;Hey&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/146713"&gt;@messiah&lt;/a&gt;,&lt;BR /&gt;&lt;BR /&gt;The problem is probably in the cluster configuration.&lt;/P&gt;&lt;P class=""&gt;When using a &lt;STRONG&gt;Shared Cluster&lt;/STRONG&gt; in Databricks, the &lt;STRONG&gt;Instance Profile assigned to the cluster will not be used&lt;/STRONG&gt; to authenticate access to AWS resources like S3. This is because Shared Clusters operate in a &lt;STRONG&gt;multi-user environment&lt;/STRONG&gt;, where permissions and credentials are tied to the individual user rather than the cluster itself.&lt;/P&gt;&lt;P class=""&gt;To work around this limitation, the best approach is to use &lt;STRONG&gt;External Locations&lt;/STRONG&gt; if your workspace is enabled for &lt;STRONG&gt;Unity Catalog&lt;/STRONG&gt;. External Locations allow administrators to define and manage access to cloud storage at the Unity Catalog level, ensuring that users can read and write data securely without needing direct access to AWS credentials.&lt;/P&gt;&lt;P class=""&gt;If Unity Catalog is not available or External Locations are not an option, a simple and effective alternative is to use a &lt;STRONG&gt;Single-User Cluster&lt;/STRONG&gt; instead of a Shared Cluster. Single-User Clusters operate in an isolated environment where all commands run under the same user identity, allowing the assigned &lt;STRONG&gt;Instance Profile to be applied correctly&lt;/STRONG&gt;. This means the cluster will have seamless access to AWS resources without requiring additional authentication mechanisms.&lt;/P&gt;&lt;P class=""&gt;By leveraging &lt;STRONG&gt;External Locations&lt;/STRONG&gt; where possible or switching to &lt;STRONG&gt;Single-User Clusters&lt;/STRONG&gt;, you can avoid authentication issues while ensuring secure and efficient access to cloud storage in Databricks.&lt;BR /&gt;&lt;BR /&gt;Hope that helps &lt;span class="lia-unicode-emoji" title=":slightly_smiling_face:"&gt;🙂&lt;/span&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 29 Jan 2025 22:05:34 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/unable-to-read-data-from-s3-in-databricks-aws-free-trial/m-p/107666#M42882</guid>
      <dc:creator>Isi</dc:creator>
      <dc:date>2025-01-29T22:05:34Z</dc:date>
    </item>
    <item>
      <title>Re: Unable to Read Data from S3 in Databricks (AWS Free Trial)</title>
      <link>https://community.databricks.com/t5/data-engineering/unable-to-read-data-from-s3-in-databricks-aws-free-trial/m-p/107685#M42886</link>
      <description>&lt;P&gt;Hey &lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/145555"&gt;@Isi&lt;/a&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I'm using a single-node cluster with DBR 14.3 LTS, and Unity Catalog is enabled. I also tried DBR 15.x LTS, but the issue persists.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I set up the workspace using the quickstart method, which created four Databricks roles—two for Lambda, one for EC2 instance setup, and one for S3. These roles are linked to my workspace. The cluster configuration shows "None" for the instance profile.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I also tried enabling credential passthrough, but nothing worked.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Is there any other fix for this?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Thu, 30 Jan 2025 03:27:19 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/unable-to-read-data-from-s3-in-databricks-aws-free-trial/m-p/107685#M42886</guid>
      <dc:creator>messiah</dc:creator>
      <dc:date>2025-01-30T03:27:19Z</dc:date>
    </item>
    <item>
      <title>Re: Unable to Read Data from S3 in Databricks (AWS Free Trial)</title>
      <link>https://community.databricks.com/t5/data-engineering/unable-to-read-data-from-s3-in-databricks-aws-free-trial/m-p/107736#M42910</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/146713"&gt;@messiah&lt;/a&gt;,&lt;/P&gt;&lt;P&gt;This occurs due to the lack of AWS credentials or IAM roles necessary to access the S3 bucket.&lt;/P&gt;&lt;P&gt;Can you please check the &lt;STRONG&gt;AWS credentials&lt;/STRONG&gt;, &lt;STRONG&gt;IAM roles&lt;/STRONG&gt; and &lt;STRONG&gt;IAM permissions&lt;/STRONG&gt;? Make sure the IAM role associated with the instance profile has the necessary permissions to access the S3 bucket, such as &lt;CODE&gt;s3:ListBucket&lt;/CODE&gt; and &lt;CODE&gt;s3:GetObject&lt;/CODE&gt;.&lt;/P&gt;</description>
      <pubDate>Thu, 30 Jan 2025 08:39:05 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/unable-to-read-data-from-s3-in-databricks-aws-free-trial/m-p/107736#M42910</guid>
      <dc:creator>Sidhant07</dc:creator>
      <dc:date>2025-01-30T08:39:05Z</dc:date>
    </item>
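The permissions named in the last reply can be written down as a minimal IAM policy document: s3:ListBucket applies to the bucket ARN itself, while s3:GetObject applies to the objects under it. A sketch built in Python (the bucket name is a placeholder, not the one from the thread):

```python
import json

# Minimal read-only IAM policy for one S3 bucket, per the reply above:
# s3:ListBucket on the bucket ARN, s3:GetObject on the objects under it.
# The bucket name is a placeholder, not from this thread.
bucket = "my-trial-bucket"
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:ListBucket"],
            "Resource": [f"arn:aws:s3:::{bucket}"],
        },
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject"],
            # Object-level actions need the /* suffix on the ARN.
            "Resource": [f"arn:aws:s3:::{bucket}/*"],
        },
    ],
}
print(json.dumps(policy, indent=2))
```

Note that in this thread the policy alone would not have helped: the role carrying it still has to reach the cluster, either via an instance profile attached in the cluster configuration or via a Unity Catalog external location.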
  </channel>
</rss>