
Determine exact location of MLflow model tracking and model registry files and the Backend Stores

ScyLukb
New Contributor

I would like to determine the exact location of:

1. MLflow model tracking files

2. Model registry files (with Workspace Model Registry)

The documentation only mentions that "All methods copy the model into a secure location managed by the Workspace Model Registry," but provides no specific details. Where can I find these details?

Additionally, I would like to find out which Backend Store and Artifact Store are configured in Azure Databricks for model tracking and the Model Registry. Where can I find this information?

Thanks a lot for any help provided.

1 REPLY

Louis_Frolio
Databricks Employee

Greetings @ScyLukb, you’re right that the docs say the Workspace Model Registry copies models to a “secure location” but don’t name it prominently. Here’s where those files actually live and how to discover the configured stores.

Locations of MLflow tracking and registry files

  • MLflow experiment tracking artifacts (default) are stored in MLflow-managed artifact storage at dbfs:/databricks/mlflow-tracking/<experiment-id>/<run-id>/artifacts/. If you don’t specify an artifact_location when creating the experiment, Databricks uses this default location. The same path shape appears as /dbfs/databricks/mlflow-tracking/<experiment-id>/<run-id>/artifacts/ in the KB guidance for downloading artifacts, and you can load models directly from “an MLflow-managed artifact storage path beginning with dbfs:/databricks/mlflow-tracking/”. A sketch after this list shows how to print a run’s exact artifact URI and download its files through the MLflow client.
  • Workspace Model Registry artifacts (legacy) are stored under dbfs:/databricks/mlflow-registry. This is a read-only system path used by Databricks internal APIs and remains accessible even when DBFS root and mounts are disabled in Azure Databricks. Engineers also reference this path for workspace registry artifacts and note that direct dbutils.fs browsing is blocked and you should use the MLflow client to list/download instead. The registry docs state “All methods copy the model into a secure location managed by the Workspace Model Registry,” which refers to this managed DBFS system path.
  • Unity Catalog Model Registry artifacts (recommended) are stored in UC-governed cloud storage tied to your catalog/schema, and model versions are “finalized” after uploading model files to “its storage location.” UC uses managed storage locations defined for the metastore, catalog, or schema to govern where artifacts land and how access is controlled via storage credentials/external locations.
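
As a quick illustration, here is a minimal sketch (the run ID and artifact path are placeholders, and it assumes the model was logged under the artifact path "model") that prints a run’s artifact root and downloads its files through the MLflow client instead of browsing DBFS directly:

```python
import mlflow

# Placeholder run ID -- substitute a run from one of your experiments.
run_id = "<run-id>"

# Artifact root for the run; on Databricks this is typically
# dbfs:/databricks/mlflow-tracking/<experiment-id>/<run-id>/artifacts
print(mlflow.get_run(run_id).info.artifact_uri)

# Direct dbutils.fs access to MLflow-managed paths is blocked,
# so download through the MLflow client instead.
local_path = mlflow.artifacts.download_artifacts(
    run_id=run_id,
    artifact_path="model",  # assumption: the model was logged under "model"
    dst_path="/local_disk0/run_artifacts",
)
print(local_path)
```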

Where to find the configured Backend Store and Artifact Store

  • Backend Store (MLflow tracking server on Databricks): In Azure Databricks, the tracking server is a fully managed service; you don’t configure its database or endpoints yourself. Each workspace has its own dedicated managed MLflow tracking service, and you use the tracking URI “databricks”. There isn’t a UI page that exposes backend store details; it’s part of the platform.
  • Artifact Store (experiments):
    • In the UI, open your experiment’s details and click the info icon next to the experiment name; it shows the artifact_location for that experiment. If you didn’t set one, it will be the MLflow-managed default at dbfs:/databricks/mlflow-tracking/<experiment-id>. You can also choose a UC Volume path (dbfs:/Volumes/…) when creating the experiment so artifacts go into UC-governed storage. The artifact_location can also be retrieved programmatically; see the sketch after this list.
  • Artifact Store (Workspace Model Registry):
    • Workspace registry model files are in dbfs:/databricks/mlflow-registry (read-only system path). To see the exact artifact path for a model version, query the Model Registry with the MLflow client or REST APIs (artifact path locations are intentionally not included in webhook payloads).
  • Artifact Store (Unity Catalog Model Registry):
    • UC model versions are written to the storage location of the catalog/schema containing the model; you can inspect storage locations by describing the catalog/schema or reviewing UC Storage Locations in the admin UI. The UC models documentation explicitly refers to uploading model version files to “its storage location,” and UC’s storage-location concept explains how these are governed.
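
For example, a minimal sketch (the experiment name, catalog, and schema are placeholders; the SQL step assumes a Databricks notebook with Unity Catalog enabled) for reading an experiment’s artifact_location and a UC schema’s managed storage location:

```python
import mlflow
from mlflow.tracking import MlflowClient

# On Databricks, the managed tracking server is addressed simply as "databricks".
mlflow.set_tracking_uri("databricks")
client = MlflowClient()

# Placeholder experiment path -- use the workspace path of your experiment.
exp = client.get_experiment_by_name("/Users/<user>/my-experiment")
if exp is not None:
    # Defaults to dbfs:/databricks/mlflow-tracking/<experiment-id> unless you set one.
    print(exp.artifact_location)

# In a notebook, a UC schema's managed storage location can be inspected with SQL
# (placeholder catalog/schema names):
# spark.sql("DESCRIBE SCHEMA EXTENDED prod.ml_team").show(truncate=False)
```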

How to query exact artifact paths programmatically

  • Download a registered model’s artifacts (works for the UC or Workspace registry):

```python
import mlflow

# Set to the UC registry or the Workspace registry as needed:
# UC registry:
mlflow.set_registry_uri("databricks-uc")
# Workspace registry (legacy):
# mlflow.set_registry_uri("databricks")

model_name = "prod.ml_team.iris_model"  # for UC, use a three-level name
version = "1"

dst = "/local_disk0/model"
mlflow.artifacts.download_artifacts(
    artifact_uri=f"models:/{model_name}/{version}", dst_path=dst
)
```
  • Find experiment artifact root (default or custom): In the experiment details UI, click the info icon; it displays the artifact_location (for MLflow default, it will be dbfs:/databricks/mlflow-tracking/<experiment-id>).
  • Accessing artifacts in MLflow-managed paths: For MLflow-managed paths under dbfs:/databricks/mlflow-tracking, you must use MLflow client APIs to list/download artifacts; direct DBFS listing is intentionally blocked for permissions enforcement. A sketch resolving a model version’s source path and listing its run artifacts follows.
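
As a complement to the UI route, here is a minimal sketch (the model name and version are placeholders, and it assumes the registry URI has already been set, e.g. mlflow.set_registry_uri("databricks-uc") for UC) that resolves a registered model version’s underlying artifact location and lists its run files through the MLflow client:

```python
from mlflow.tracking import MlflowClient

client = MlflowClient()

# Placeholder name/version -- use "catalog.schema.model" for UC,
# or a plain name when targeting the Workspace registry.
name, version = "prod.ml_team.iris_model", "1"

mv = client.get_model_version(name=name, version=version)
# `source` is the artifact location the version was registered from,
# e.g. a dbfs:/databricks/mlflow-tracking/... run path.
print(mv.source, mv.run_id)

# List the files logged under the originating run (when run_id is set),
# rather than browsing the managed DBFS path directly.
if mv.run_id:
    for f in client.list_artifacts(mv.run_id):
        print(f.path, f.file_size)
```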

Extra notes

  • The text “All methods copy the model into a secure location managed by the Workspace Model Registry” in the Workspace Model Registry docs refers to the internal, read-only DBFS system path where registry copies live (dbfs:/databricks/mlflow-registry).
  • If you are starting fresh or can migrate, Databricks recommends using the Unity Catalog Model Registry over the Workspace Model Registry for governance, cross-workspace access, and lineage. A minimal registration sketch follows.
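
If you do move to the UC registry, registering a logged model might look like the following minimal sketch (the catalog, schema, and run ID are placeholders; it assumes a model was already logged to the run under the artifact path "model"):

```python
import mlflow

# Target the Unity Catalog Model Registry.
mlflow.set_registry_uri("databricks-uc")

# Placeholder run ID of a run that logged a model under the artifact path "model".
run_id = "<run-id>"

# Three-level name: <catalog>.<schema>.<model>
mlflow.register_model(
    model_uri=f"runs:/{run_id}/model",
    name="prod.ml_team.iris_model",
)
```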
 
Hope this helps, Louis.