<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Azure Databricks S3 External Location in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/azure-databricks-s3-external-location/m-p/151375#M53637</link>
    <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/221443"&gt;@tsmith-11&lt;/a&gt;,&lt;/P&gt;
&lt;P&gt;Based on internal checks and your screenshot, this doesn’t look like a configuration issue on your side; rather, the cross‑cloud S3 feature isn’t enabled on your Azure Databricks account/metastore yet. Once it is enabled, you should see an AWS IAM Role (read-only) option in the dropdown menu.&lt;/P&gt;
&lt;P&gt;Given that you have already validated all the prerequisites, your best option is to ask your Databricks account team or raise a support ticket to confirm that the feature is enabled for your account, region, and metastore. Include the workspace URL, region, metastore name, and the screenshot so they can investigate.&lt;/P&gt;
&lt;P&gt;There is no easy workaround other than copying the data from S3 to ADLS first and then reading it from there, though that adds significant complexity.&lt;/P&gt;
&lt;P class="p1"&gt;&lt;FONT size="2" color="#FF6600"&gt;&lt;STRONG&gt;&lt;I&gt;If this answer resolves your question, could you mark it as “Accept as Solution”? That helps other users quickly find the correct fix.&lt;/I&gt;&lt;/STRONG&gt;&lt;/FONT&gt;&lt;I&gt;&lt;/I&gt;&lt;/P&gt;</description>
    <pubDate>Thu, 19 Mar 2026 09:23:15 GMT</pubDate>
    <dc:creator>Ashwin_DSA</dc:creator>
    <dc:date>2026-03-19T09:23:15Z</dc:date>
    <item>
      <title>Azure Databricks S3 External Location</title>
      <link>https://community.databricks.com/t5/data-engineering/azure-databricks-s3-external-location/m-p/151359#M53632</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;I have recently created a new Azure Databricks account and several workspaces. I need to ingest data from an S3 bucket and am trying to follow the documentation detailed here:&lt;/P&gt;&lt;P&gt;&lt;A href="https://learn.microsoft.com/en-us/azure/databricks/connect/unity-catalog/cloud-storage/storage-credentials-s3" target="_blank" rel="noopener"&gt;https://learn.microsoft.com/en-us/azure/databricks/connect/unity-catalog/cloud-storage/storage-credentials-s3&lt;/A&gt;&lt;BR /&gt;&lt;A href="https://www.databricks.com/blog/announcing-general-availability-cross-cloud-data-governance" target="_blank" rel="noopener"&gt;https://www.databricks.com/blog/announcing-general-availability-cross-cloud-data-governance&lt;/A&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;When I go to create the IAM role credential, though, I don't see it as an option in the dropdown:&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="chrome_SAyP6JuECH.png" style="width: 638px;"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/24992iD3F19FD9C2757864/image-size/large?v=v2&amp;amp;px=999" role="button" title="chrome_SAyP6JuECH.png" alt="chrome_SAyP6JuECH.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;I have also tried running the SQL command `CREATE STORAGE CREDENTIAL` but get a generic syntax error.&lt;/P&gt;&lt;P&gt;So far I have confirmed:&lt;/P&gt;&lt;P&gt;- I am a metastore admin&lt;/P&gt;&lt;P&gt;- The workspace is connected to Unity Catalog&lt;/P&gt;&lt;P&gt;- The serverless egress control is set to `Full`&lt;/P&gt;&lt;P&gt;- There is no option under `Feature Enablement` or `Previews`&lt;/P&gt;&lt;P&gt;- The workspaces are `Premium`&lt;/P&gt;&lt;P&gt;- The option is missing in all workspaces&lt;/P&gt;&lt;P&gt;- I am in a region that allows this feature&lt;/P&gt;&lt;P&gt;I was wondering if anyone else has encountered this, has suggestions, or has managed to configure it according to that documentation.&lt;/P&gt;&lt;P&gt;Thanks,&lt;/P&gt;</description>
      <pubDate>Thu, 19 Mar 2026 07:28:55 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/azure-databricks-s3-external-location/m-p/151359#M53632</guid>
      <dc:creator>tsmith-11</dc:creator>
      <dc:date>2026-03-19T07:28:55Z</dc:date>
    </item>
    <item>
      <title>Re: Azure Databricks S3 External Location</title>
      <link>https://community.databricks.com/t5/data-engineering/azure-databricks-s3-external-location/m-p/151375#M53637</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/221443"&gt;@tsmith-11&lt;/a&gt;,&lt;/P&gt;
&lt;P&gt;Based on internal checks and your screenshot, this doesn’t look like a configuration issue on your side; rather, the cross‑cloud S3 feature isn’t enabled on your Azure Databricks account/metastore yet. Once it is enabled, you should see an AWS IAM Role (read-only) option in the dropdown menu.&lt;/P&gt;
&lt;P&gt;Given that you have already validated all the prerequisites, your best option is to ask your Databricks account team or raise a support ticket to confirm that the feature is enabled for your account, region, and metastore. Include the workspace URL, region, metastore name, and the screenshot so they can investigate.&lt;/P&gt;
&lt;P&gt;There is no easy workaround other than copying the data from S3 to ADLS first and then reading it from there, though that adds significant complexity.&lt;/P&gt;
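&lt;P&gt;If you need the data before the feature is enabled, the copy-to-ADLS workaround could be sketched roughly as below. This is a hypothetical example, not a confirmed configuration: the bucket, container, and storage account names are placeholders, and it assumes S3 and ADLS access have already been configured in the cluster's Spark configuration.&lt;/P&gt;
&lt;PRE&gt;# Hypothetical PySpark sketch of the workaround: copy S3 data into ADLS,
# then read it from ADLS going forward. All paths below are placeholders.
df = spark.read.parquet("s3a://source-bucket/path/")
df.write.mode("overwrite").parquet(
    "abfss://container@storageaccount.dfs.core.windows.net/landing/path/"
)&lt;/PRE&gt;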
&lt;P class="p1"&gt;&lt;FONT size="2" color="#FF6600"&gt;&lt;STRONG&gt;&lt;I&gt;If this answer resolves your question, could you mark it as “Accept as Solution”? That helps other users quickly find the correct fix.&lt;/I&gt;&lt;/STRONG&gt;&lt;/FONT&gt;&lt;I&gt;&lt;/I&gt;&lt;/P&gt;</description>
      <pubDate>Thu, 19 Mar 2026 09:23:15 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/azure-databricks-s3-external-location/m-p/151375#M53637</guid>
      <dc:creator>Ashwin_DSA</dc:creator>
      <dc:date>2026-03-19T09:23:15Z</dc:date>
    </item>
  </channel>
</rss>