Use on-premises MinIO as an artifact store in an experiment

somnus
New Contributor II

Hi. I'm trying to use managed MLflow with our own MinIO as artifact storage. I can see that storage options are described on the landing page, and there is an input for an artifact store URI when creating an empty experiment in the Databricks workspace UI. However, I can't find any documentation about it: the form of the URI, how to set an auth token for our MinIO, and so on. I'm not even sure whether an external MinIO (or NFS, or local file paths, as described on the landing page) can be used at all.
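For reference, with open-source MLflow the usual way to target a MinIO (S3-compatible) artifact store is through environment variables; whether managed MLflow on Databricks honors these is exactly what I can't find documented. The endpoint, bucket, and credentials below are placeholders for illustration:

```shell
# Open-source MLflow reads these when writing artifacts to an
# S3-compatible store such as MinIO (placeholder values):
export MLFLOW_S3_ENDPOINT_URL=http://minio.internal:9000
export AWS_ACCESS_KEY_ID=minio-access-key
export AWS_SECRET_ACCESS_KEY=minio-secret-key

# The artifact location itself then uses the standard s3:// URI form,
# e.g. s3://mlflow-artifacts/my-experiment
```

If the managed offering supported this, I'd expect the experiment-creation form to accept an `s3://bucket/path` URI plus some way to supply the endpoint and credentials, but that's an assumption on my part.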

Please link me to the related docs, or let me know if this is not possible.

If it is not possible, please update the feature section on the landing page at https://www.databricks.com/product/managed-mlflow:

ARTIFACT STORE: Store large files such as S3 buckets, shared NFS file system, and models in Amazon S3, Azure Blob Storage, Google Cloud Storage, SFTP server, NFS, and local file paths.


Kaniz
Community Manager

Hi @somnus, if you want a new feature to be added, you can request it here at this link.

somnus
New Contributor II

Thanks. I will post there if the feature I asked about doesn't exist.

Anyway, the feature I asked about is clearly described on the landing page. I'm looking for documentation for that feature.
