Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Azure Databricks S3 External Location

tsmith-11
Databricks Partner

Hi,

I have recently created a new Azure Databricks account and several workspaces. I need to ingest data from an S3 bucket and am trying to follow the documentation detailed here:

https://learn.microsoft.com/en-us/azure/databricks/connect/unity-catalog/cloud-storage/storage-crede...
https://www.databricks.com/blog/announcing-general-availability-cross-cloud-data-governance 

When I go to create the IAM role credential, though, I don't see it as an option in the dropdown:

[Screenshot: chrome_SAyP6JuECH.png]

I have also tried running the SQL command `CREATE STORAGE CREDENTIAL`, but I get a generic syntax error.
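For reference, storage credentials can also be created through the Databricks CLI rather than SQL. A minimal sketch, assuming the newer Databricks CLI (v0.2xx) with the Unity Catalog command surface, and with a placeholder credential name and role ARN:

```shell
# Hedged sketch: create a read-only AWS IAM role storage credential via the
# Databricks CLI. The credential name and role ARN below are placeholders.
databricks storage-credentials create --json '{
  "name": "s3_readonly_cred",
  "aws_iam_role": {
    "role_arn": "arn:aws:iam::123456789012:role/my-databricks-s3-role"
  },
  "read_only": true
}'
```

If the cross-cloud S3 feature is not enabled on the account, I would expect this to fail with a permissions or unsupported-credential-type error rather than succeed.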

So far I have confirmed:

- I am a metastore admin

- The workspace is connected to unity catalog

- The serverless egress control is set to `Full`

- There is no option under `Feature Enablement` or `Previews`

- That the workspaces are `Premium`

- That the option is missing in all workspaces

- I am in a region that allows this feature


I was wondering if anyone else has encountered this, has suggestions, or has managed to configure it according to that documentation.

Thanks, 

1 ACCEPTED SOLUTION


Ashwin_DSA
Databricks Employee

Hi @tsmith-11,

Having checked internally and reviewed the screenshot, I don't think this is a configuration issue on your side; rather, the cross‑cloud S3 feature isn't enabled on your Azure Databricks account/metastore yet. When it is enabled, you should see an AWS IAM Role (read-only) option in the dropdown.

Given that you have already validated all the prerequisites, your best option is to ask your Databricks account team or raise a support ticket to confirm that the feature is enabled for your account, region, and metastore. You may want to include the workspace URL, region, metastore ID, and the screenshot so they can investigate.

As far as I can tell, there is no easy workaround other than copying the data from S3 to ADLS first and then reading it from there, though I appreciate that adds complexity to your pipeline.
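In case it helps as an interim measure, here is a minimal sketch of that copy-based workaround as it might look in an Azure Databricks notebook. It assumes AWS keys stored in a secret scope and placeholder bucket, container, and account names; the `spark` and `dbutils` objects are the notebook-provided globals:

```python
# Hedged sketch: stage S3 data into ADLS, then read it from there.
# All names (secret scope/keys, bucket, container, account) are placeholders.
spark.conf.set("fs.s3a.access.key", dbutils.secrets.get("aws", "access-key"))
spark.conf.set("fs.s3a.secret.key", dbutils.secrets.get("aws", "secret-key"))

# Read the source data directly from S3.
df = spark.read.format("parquet").load("s3a://source-bucket/raw/events/")

# Land a copy in ADLS, where Unity Catalog external locations already work.
df.write.format("delta").mode("overwrite").save(
    "abfss://landing@mystorageaccount.dfs.core.windows.net/raw/events/"
)
```

From there you can define your external location/tables over the ADLS path as usual, and drop the staging step once the cross-cloud feature is enabled.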

If this answer resolves your question, could you mark it as “Accept as Solution”? That helps other users quickly find the correct fix.

Regards,
Ashwin | Delivery Solution Architect @ Databricks
Helping you build and scale the Data Intelligence Platform.
***Opinions are my own***

