Machine Learning
Dive into the world of machine learning on the Databricks platform. Explore discussions on algorithms, model training, deployment, and more. Connect with ML enthusiasts and experts.

Use on-premise MinIO as an artifact store in experiment

somnus
New Contributor II

Hi. I'm trying to use managed MLflow with our own MinIO as the artifact store. I can see a description of storage options on the landing page, and there is an input for the artifact store URI when creating an empty experiment in the Databricks workspace UI. However, I can't find any documentation about it: the expected form of the URI, how to set the auth credentials for our MinIO, and so on. I'm not even sure whether an external MinIO (or NFS, or local file paths, as described on the landing page) can be used at all.

Please link me to the relevant docs, or let me know if this is not possible.

If it is not possible, please update the features section on the landing page at https://www.databricks.com/product/managed-mlflow, which currently says:

ARTIFACT STORE: Store large files such as S3 buckets, shared NFS file system, and models in Amazon S3, Azure Blob Storage, Google Cloud Storage, SFTP server, NFS, and local file paths.
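For reference, with open-source MLflow an S3-compatible store such as MinIO is normally wired up through environment variables plus an `s3://` artifact location. This is a sketch of that open-source setup (the endpoint URL, credentials, and bucket path below are hypothetical); whether Databricks-managed MLflow honors the same settings is exactly the open question in this thread.

```shell
# Point MLflow's S3 artifact client at the MinIO endpoint (open-source MLflow behavior)
export MLFLOW_S3_ENDPOINT_URL=http://minio.example.internal:9000  # hypothetical MinIO endpoint
export AWS_ACCESS_KEY_ID=minio-access-key                         # MinIO credentials, read via the
export AWS_SECRET_ACCESS_KEY=minio-secret-key                     # standard AWS env vars

# Create an experiment whose artifacts land in a MinIO bucket via the s3:// scheme
mlflow experiments create \
  --experiment-name minio-artifact-demo \
  --artifact-location s3://mlflow-artifacts/demo                  # hypothetical bucket/path
```

In open-source MLflow the `s3://` scheme plus `MLFLOW_S3_ENDPOINT_URL` is what redirects artifact uploads away from AWS S3 to a compatible server; the question here is whether the managed offering exposes an equivalent.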

1 REPLY

somnus
New Contributor II

Thanks. I will post there if the feature I asked about doesn't exist.

Anyway, the feature I asked about is clearly described on the landing page; what I'm looking for is the documentation for it.
