Data Engineering

When using MLflow tracking, where does it store the tracked parameters, metrics and artifacts?

Anonymous
Not applicable

I saw the default path for artifacts is DBFS, but I'm not sure if that's where everything else is stored. Can we modify it?

Accepted Solution

sean_owen
Databricks Employee

Artifacts such as models, model metadata like the "MLmodel" file, input samples, and other logged artifacts (plots, configs, network architectures) are stored as files. These could be plain local filesystem files when the tracking server runs as a standalone service, but typically they are stored on distributed storage, as is the case with Databricks' hosted MLflow.

The location is determined by the Experiment being logged to, which can be configured to write to any mounted storage. By default in Databricks, artifacts are logged to a secured path in the root bucket that is protected by ACLs.
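
As a minimal sketch of overriding that location per experiment with the MLflow Python client (the mount path and user folder below are hypothetical placeholders):

import mlflow

# Create an experiment whose files are written to a custom mounted path
# instead of the default managed location; params and metrics still go to
# the tracking database.
experiment_id = mlflow.create_experiment(
    name="/Users/someone@example.com/mlflow-artifact-demo",
    artifact_location="dbfs:/mnt/my-bucket/mlflow-artifacts",
)

with mlflow.start_run(experiment_id=experiment_id):
    mlflow.log_param("lr", 0.01)               # metadata -> tracking database
    mlflow.log_metric("accuracy", 0.93)        # metadata -> tracking database
    mlflow.log_text("run notes", "notes.txt")  # file -> artifact_location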

Metadata such as params, tags, metrics, and notes is logged to a database backing the MLflow tracking server, which can be most standard databases. In Databricks, that database is managed in the control plane.
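
On a self-managed tracking server that metadata store is configurable (for example via mlflow server --backend-store-uri); on Databricks it is managed for you. A minimal sketch pointing the client at a local SQLite backend, just to illustrate the split (file names are placeholders):

import mlflow

# Use a SQLite-backed tracking store: params, tags, and metrics are written
# to mlflow.db, while artifact files go to a local ./mlruns directory by
# default unless the experiment sets an artifact_location.
mlflow.set_tracking_uri("sqlite:///mlflow.db")

with mlflow.start_run():
    mlflow.log_param("batch_size", 32)
    mlflow.log_metric("loss", 0.42)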
