Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Connecting an S3-compatible endpoint (such as MinIO) to Unity Catalog

demo-user
New Contributor II

Hi everyone, is it possible to connect an S3-compatible storage endpoint that is not AWS S3 (for example, MinIO) to Databricks Unity Catalog? I already have access via Spark configuration (s3a endpoint, access key, secret key, etc.), and I can read and write data at the Spark level.

But is an S3-compatible storage endpoint supported through Unity Catalog using access key/secret key credentials rather than an IAM role?
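
For reference, here's roughly how the Spark-level access works today. This is a minimal sketch: the endpoint URL, bucket, and credentials below are placeholders, not my real values, and it assumes the hadoop-aws s3a connector is on the classpath (as it is on Databricks).

```python
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("minio-s3a-sketch")
    # Point the s3a connector at the S3-compatible endpoint
    .config("spark.hadoop.fs.s3a.endpoint", "https://minio.example.internal:9000")
    # Static credentials instead of an IAM role
    .config("spark.hadoop.fs.s3a.access.key", "<access-key>")
    .config("spark.hadoop.fs.s3a.secret.key", "<secret-key>")
    # Non-AWS endpoints typically need path-style addressing
    .config("spark.hadoop.fs.s3a.path.style.access", "true")
    .getOrCreate()
)

# With these settings, reads and writes work at the Spark level
df = spark.read.parquet("s3a://my-bucket/some/table/")
df.show()
```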

2 REPLIES

MoJaMa
Databricks Employee

Unfortunately, registering those as Storage Credentials is not supported. But the ask seems to be coming up more frequently, and I believe Product is in the "Discovery" phase of supporting it in UC. Here are some standard questions that might help with collecting evidence:

  1. What use cases would be unblocked?
  2. What data volumes do you have in these storage systems?
  3. What workarounds have you tried? What makes this CUJ (critical user journey) difficult?
  4. What competitive solutions are you considering?

~Mohan Mathews

demo-user
New Contributor II

Thank you for your help, @MoJaMa!
We have a unified endpoint that is S3-compatible and connects to multiple data sources. Those data sources can be in the cloud or on-prem, and the location of the data doesn't matter to the user. The unified endpoint serves several goals: simplicity, unified access, and improved performance.

We can successfully connect today using s3a path-style access with access keys and secret keys, and we are able to read and write data at the Spark level. We now want to understand how our setup can be supported with Unity Catalog.
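
For completeness, here's a minimal sketch of the round-trip check we run at the Spark level, assuming the cluster's Spark config already carries the s3a endpoint, credential, and path-style settings shown earlier. The bucket and paths are placeholders, and `spark` is the session a Databricks notebook provides.

```python
# Build a small test DataFrame
df = spark.range(10).withColumnRenamed("id", "value")

# Write to the S3-compatible endpoint through the s3a connector...
df.write.mode("overwrite").parquet("s3a://my-bucket/uc-poc/roundtrip/")

# ...and read it back to confirm the round trip works at the Spark level
spark.read.parquet("s3a://my-bucket/uc-poc/roundtrip/").show()
```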