Is there a way to change the default artifact store path on Databricks MLflow?

Eero_H
New Contributor

I have cloud storage mounted to Databricks and I would like to store all of the model artifacts there without specifying it when creating a new experiment.

Is there a way to configure the Databricks workspace to save all of the model artifacts to a custom location in DBFS by default?

2 REPLIES

Hubert-Dudek
Esteemed Contributor III

According to the Databricks MLflow documentation, you can specify a custom artifact location when creating a new experiment using the artifact_location parameter of mlflow.create_experiment(). This overrides the default artifact location (/databricks/mlflow).
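For example, a minimal sketch in Python; the experiment name and mount path (dbfs:/mnt/my-artifact-store) are placeholders to substitute with your own:

import mlflow

# Hypothetical experiment name and mount path -- replace with your own values.
experiment_id = mlflow.create_experiment(
    name="/Users/someone@example.com/my-experiment",
    artifact_location="dbfs:/mnt/my-artifact-store/mlflow-artifacts",
)

# Runs started under this experiment write their artifacts to the mounted storage.
with mlflow.start_run(experiment_id=experiment_id):
    mlflow.log_param("alpha", 0.5)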

However, if you want to change the default artifact location for all experiments, you may need to set the MLFLOW_TRACKING_URI environment variable to point to your desired cloud storage path before running any MLflow code. This tells MLflow where to store and retrieve artifacts for all experiments.
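As a sketch of that suggestion, the variable can be set before MLflow is imported; the path shown is a placeholder, and the exact effect on artifact storage depends on how your tracking server is configured:

import os

# Placeholder path; point this at your desired storage location as described above.
os.environ["MLFLOW_TRACKING_URI"] = "dbfs:/mnt/my-artifact-store/mlflow"

import mlflow

# Confirm which tracking URI MLflow picked up from the environment.
print(mlflow.get_tracking_uri())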

Anonymous
Not applicable

Hi @Eero Hiltunen,

Thank you for your question! To assist you better, please take a moment to review the reply above and let us know whether it answers your question.

Please help us select the best solution by clicking on "Select As Best" if it does.

Your feedback will help us ensure that we are providing the best possible service to you. Thank you!
