Connecting to an S3-compatible bucket
12-11-2025 10:30 AM
Hi everyone,
I’m trying to connect Databricks to an S3-compatible bucket using a custom endpoint URL and access keys.
I’m using an Express account with Serverless SQL Warehouses, but the only external storage options I see are AWS IAM roles or Cloudflare R2.
Is there any supported way to connect to a generic S3-compatible object store (via access key/secret and an endpoint URL)? If not, is there a workaround?
12-11-2025 03:07 PM
@demo-user Serverless SQL Warehouses and the Express account type are designed for simplicity and rely almost exclusively on the cloud provider's secure identity mechanisms.
- Serverless SQL Warehouses (and Serverless Jobs) do not currently support configuring generic S3-compatible endpoint URLs and access keys.
- Complex, non-standard cloud integrations are often limited or unsupported on Express accounts compared to Enterprise accounts.
The only reliable and supported way to connect to a generic S3-compatible object store using a custom endpoint and access keys is to use Provisioned Compute, where you control the Spark configuration.
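For reference, here is a minimal sketch of what that looks like on a provisioned (classic) cluster. It sets the Hadoop S3A connector properties at runtime from a Python notebook; the endpoint URL, bucket path, and secret scope/key names are placeholders you would replace with your own.

```python
# Sketch: point the Hadoop S3A connector at a generic S3-compatible store.
# The endpoint, bucket, and secret scope/key names are placeholders.
hconf = spark.sparkContext._jsc.hadoopConfiguration()

hconf.set("fs.s3a.endpoint", "https://s3.example-provider.com")  # custom endpoint URL
hconf.set("fs.s3a.access.key", dbutils.secrets.get("my-scope", "s3-access-key"))
hconf.set("fs.s3a.secret.key", dbutils.secrets.get("my-scope", "s3-secret-key"))
hconf.set("fs.s3a.path.style.access", "true")  # many S3-compatible stores require path-style URLs

# Read through the s3a:// scheme once the connector is configured.
df = spark.read.parquet("s3a://my-bucket/path/to/data/")
df.show(5)
```

The same fs.s3a.* properties can instead be set in the cluster's Spark config with the spark.hadoop. prefix, referencing the keys from a secret scope (e.g. {{secrets/my-scope/s3-access-key}}) rather than hard-coding them.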
RG #Driving Business Outcomes with Data Intelligence
12-11-2025 03:19 PM
Thank you for your response! Are Spark configurations enabled on Enterprise accounts with serverless compute? I am getting errors when I attempt to set Spark configurations.
12-11-2025 03:28 PM
Serverless compute does not support setting most Apache Spark configuration properties, regardless of account tier, because Databricks fully manages the underlying infrastructure.
RG #Driving Business Outcomes with Data Intelligence