Is there a way to change the default artifact store path on Databricks MLflow?
03-07-2023 02:20 AM
I have cloud storage mounted to Databricks, and I would like to store all of the model artifacts there without specifying the location each time I create a new experiment.
Is there a way to configure the Databricks workspace to save all of the model artifacts to a custom location in DBFS by default?
- Labels:
  - Cloud Storage
  - Databricks MLflow
03-07-2023 04:00 AM
According to the Databricks MLflow documentation, you can specify a custom artifact location when creating a new experiment using the `artifact_location` parameter of `mlflow.create_experiment()`. This overrides the default artifact location (/databricks/mlflow) for that experiment.
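For example, a minimal sketch in Python (the mount path and experiment name below are hypothetical placeholders; substitute your own mount point and workspace path):

```python
import mlflow

# Hypothetical DBFS mount point for your cloud storage.
artifact_path = "dbfs:/mnt/my-cloud-storage/mlflow-artifacts"

# Create the experiment with a custom artifact location; artifacts logged
# under this experiment are stored at artifact_path instead of the
# workspace default.
experiment_id = mlflow.create_experiment(
    name="/Users/someone@example.com/my-experiment",
    artifact_location=artifact_path,
)

# Log runs against the new experiment.
with mlflow.start_run(experiment_id=experiment_id):
    mlflow.log_param("example_param", 1)
```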
However, if you want to change the default artifact location for all experiments, you may need to set the MLFLOW_TRACKING_URI environment variable before running any MLflow code. This tells MLflow which tracking server to log to; new experiments then inherit that server's default artifact root, which determines where their artifacts are stored and retrieved.
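A hedged sketch of that setup, assuming the standard MLflow client API (the "databricks" URI shown is MLflow's built-in scheme for the workspace-managed tracking server; a self-hosted server URI, configured with your desired default artifact root, could go in its place):

```python
import os
import mlflow

# Set before any MLflow calls; "databricks" targets the workspace's
# managed tracking server.
os.environ["MLFLOW_TRACKING_URI"] = "databricks"

# Equivalent programmatic form:
mlflow.set_tracking_uri("databricks")

# Confirm which tracking server subsequent runs will use.
print(mlflow.get_tracking_uri())
```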
03-31-2023 05:56 PM
Hi @Eero Hiltunen
Thank you for your question! To assist you better, please take a moment to review the answer and let us know whether it fits your needs.
If it does, please help others find it by clicking "Select As Best".
Your feedback will help us ensure that we are providing the best possible service to you. Thank you!

