Machine Learning
Dive into the world of machine learning on the Databricks platform. Explore discussions on algorithms, model training, deployment, and more. Connect with ML enthusiasts and experts.
Is there a way to change the default artifact store path on Databricks MLflow?

Eero_H
New Contributor

I have cloud storage mounted to Databricks, and I would like to store all of the model artifacts there without specifying the location each time I create a new experiment.

Is there a way to configure the Databricks workspace to save all of the model artifacts to a custom location in DBFS by default?

2 REPLIES

Hubert-Dudek
Esteemed Contributor III

According to the Databricks MLflow documentation, you can specify a custom artifact location when creating a new experiment via the artifact_location parameter of mlflow.create_experiment. This overrides the default artifact location (/databricks/mlflow) for that experiment.

However, if you want to change the default for all experiments, you may need to set the MLFLOW_TRACKING_URI environment variable before running any MLflow code. Note that this variable selects the tracking backend rather than the artifact store directly; the default artifact root for new experiments is then determined by that tracking server's configuration.
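For the environment-variable route, a minimal sketch (the URI value is illustrative; MLflow also accepts values such as "databricks" or an http(s) tracking-server URL):

```python
import os

# Set before any MLflow API calls in the process, so the client picks up
# the tracking backend on first use. The value below is illustrative.
os.environ["MLFLOW_TRACKING_URI"] = "file:./mlruns-demo"
```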

Anonymous
Not applicable

Hi @Eero Hiltunen

Thank you for your question! To assist you better, please take a moment to review the answer and let me know if it best fits your needs.

Please help us select the best solution by clicking on "Select As Best" if it does.

Your feedback will help us ensure that we are providing the best possible service to you. Thank you!
