Unable to Access S3 from Serverless but Works on Cluster
03-10-2025 10:34 AM
Hi everyone,
I am trying to access data in S3 using an access key and secret. When I run the code on a classic Databricks cluster, it works fine. However, when I run the same code on a serverless cluster, I am unable to access the data.
I have already checked:
- The credentials (access key and secret) are correct.
- The Databricks secret scope is set up correctly.
Could someone guide me on what additional configuration or steps I need to follow to make this work from serverless?
Any help is greatly appreciated!
Thanks!
03-10-2025 12:22 PM
Hi @HarryRichard08, can you please share the sample code you are using, and the error you get on serverless?
03-11-2025 04:20 AM
Hi Karanam,
Please find the syntax below:
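A minimal sketch of the typical key-based S3A pattern on a classic cluster (the secret scope, key names, and bucket name here are hypothetical placeholders, not the original snippet):

```python
# Sketch of key-based S3 access on a classic Databricks cluster.
# "my-scope", "aws-access-key", "aws-secret-key", and the bucket are placeholders.
def s3a_credential_conf(access_key: str, secret_key: str) -> dict:
    """Hadoop configuration entries for S3A key-based authentication."""
    return {
        "fs.s3a.access.key": access_key,
        "fs.s3a.secret.key": secret_key,
    }

# On Databricks this would be applied roughly as:
#   access_key = dbutils.secrets.get(scope="my-scope", key="aws-access-key")
#   secret_key = dbutils.secrets.get(scope="my-scope", key="aws-secret-key")
#   for k, v in s3a_credential_conf(access_key, secret_key).items():
#       spark.conf.set(k, v)
#   df = spark.read.parquet("s3a://my-bucket/path/")
```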
03-11-2025 02:38 PM
Hi @HarryRichard08, Databricks recommends using instance profiles (IAM roles) to connect to AWS S3, as they provide a secure, scalable method without embedding credentials in a notebook. Have you tried this approach?
https://docs.databricks.com/aws/en/connect/storage/tutorial-s3-instance-profile
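With an instance profile attached, no credentials appear in the notebook at all; a small sketch of what the read then looks like (bucket and path are hypothetical):

```python
def s3_uri(bucket: str, prefix: str) -> str:
    """Build an s3a:// URI for Spark reads (arguments are placeholders)."""
    return f"s3a://{bucket}/{prefix}"

# With the IAM role attached to the cluster, Spark picks up the role's
# credentials automatically; no spark.conf.set of access keys is needed:
#   df = spark.read.parquet(s3_uri("my-bucket", "data/"))
```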

