Hi. I'm trying to use managed MLflow with our own MinIO as artifact storage. I can see that there is a description of storage options on the landing page, and there is an input for the artifact store URI when creating an empty experiment in the Databricks workspace UI. However, I can't find any documentation about it: the URI format, how to set auth credentials for our MinIO, etc. I'm not even sure whether an external MinIO (or NFS, or local file paths, as described on the landing page) can be used at all.
Please link me to the related docs, or let me know if this is not possible.
If it is not possible, please update the feature section on the landing page at https://www.databricks.com/product/managed-mlflow, which currently reads:
ARTIFACT STORE: Store large files such as S3 buckets, shared NFS file system, and models in Amazon S3, Azure Blob Storage, Google Cloud Storage, SFTP server, NFS, and local file paths.
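For context, with open-source MLflow I can already point the S3 artifact store at MinIO via the S3-compatible endpoint override. The endpoint URL and bucket name below are placeholders for illustration; what I can't tell from the docs is whether the managed, Databricks-hosted tracking server supports an equivalent configuration for the per-experiment artifact URI:

```shell
# Open-source MLflow treats MinIO as an S3-compatible endpoint.
# Credentials come from the standard AWS variables; the endpoint
# override tells the S3 client to talk to MinIO instead of AWS.
export MLFLOW_S3_ENDPOINT_URL="https://minio.example.internal:9000"  # placeholder MinIO address
export AWS_ACCESS_KEY_ID="minio-access-key"          # MinIO access key
export AWS_SECRET_ACCESS_KEY="minio-secret-key"      # MinIO secret key

# Start a tracking server whose default artifact root is a MinIO bucket.
mlflow server \
  --backend-store-uri sqlite:///mlflow.db \
  --default-artifact-root s3://mlflow-artifacts/
```

With this setup, experiments log artifacts to `s3://mlflow-artifacts/` on MinIO, so the question is really whether the managed service exposes the same endpoint/credential knobs.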