Connecting to an S3-compatible bucket

demo-user
New Contributor III

Hi everyone,

I’m trying to connect Databricks to an S3-compatible bucket using a custom endpoint URL and access keys.
I’m using an Express account with Serverless SQL Warehouses, but the only external storage options I see are AWS IAM roles or Cloudflare R2.
Is there any supported way to connect to a generic S3-compatible object store (via access key/secret + endpoint)? What is the workaround?

Raman_Unifeye
Honored Contributor III

@demo-user Serverless SQL Warehouses and the Express account type are designed for simplicity and rely almost exclusively on the cloud provider's secure identity mechanisms.

  • Serverless SQL Warehouses (and Serverless Jobs) do not currently support configuring generic S3-compatible endpoint URLs and access keys
  • Complex, non-standard cloud integrations are often limited or unsupported compared to the Enterprise account type

The only reliable and supported way to connect to a generic S3-compatible object store using a custom endpoint and access keys is to use provisioned (classic) compute, where you control the Spark configuration yourself.
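On provisioned compute, the connection is typically expressed as cluster-level Spark configuration for the S3A connector. A minimal sketch is below; the endpoint URL, bucket name, and the secret scope/key names (`my-scope`, `s3-access-key`, `s3-secret-key`) are placeholders, not real values, and this assumes the object store speaks the S3A protocol. Keys should be stored in a Databricks secret scope and referenced with the `{{secrets/...}}` syntax rather than pasted in plain text:

```
spark.hadoop.fs.s3a.endpoint https://s3.example-provider.com
spark.hadoop.fs.s3a.access.key {{secrets/my-scope/s3-access-key}}
spark.hadoop.fs.s3a.secret.key {{secrets/my-scope/s3-secret-key}}
spark.hadoop.fs.s3a.path.style.access true
```

`path.style.access` is often required for non-AWS providers that do not serve virtual-hosted-style bucket URLs. Once the cluster is configured, reads and writes go through the `s3a://` scheme, e.g. `spark.read.parquet("s3a://my-bucket/path/")`.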


RG #Driving Business Outcomes with Data Intelligence

demo-user
New Contributor III

Thank you for your response! Are Spark configurations enabled on Enterprise accounts with serverless compute? I am getting errors when I attempt to set Spark configurations.

Raman_Unifeye
Honored Contributor III

Serverless compute does not support setting most Apache Spark configuration properties, irrespective of the Enterprise tier, because Databricks fully manages the underlying infrastructure.


RG #Driving Business Outcomes with Data Intelligence