Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

MLFlow Tracking versions

ViliamG
New Contributor

Hi team,

We are migrating from a self-hosted MLflow Tracking server to the Databricks-hosted one. However, we have concerns about the unclear process for version changes and releases on the Tracking server side. Is there any public information available on changes made or planned for minor/major version updates? That way we can know which MLflow client versions remain compatible with the Tracking server and plan client updates accordingly.

Thanks!

1 REPLY

Louis_Frolio
Databricks Employee

Hey @ViliamG, thanks for raising this. Here's how versioning and client compatibility work for the Databricks-hosted MLflow Tracking service, and where you can track changes publicly.

What's publicly available about versions

  • The Databricks-hosted MLflow Tracking service is fully managed in your workspace, so there's no server you need to maintain or patch yourself. Databricks describes the managed service as "fully managed," with automated updates for reliability and security.
  • For upstream changes, the MLflow open-source release notes are the canonical source of minor/major updates and breaking changes across MLflow modules (Tracking, Registry, Models, etc.).
  • Databricks publishes Databricks Runtime release notes with an explicit "MLflow–Databricks Runtime compatibility matrix," which shows the MLflow version bundled with each Runtime ML version. If you run code on Databricks compute, this is the most reliable way to know which MLflow client version you're using on-cluster and to plan upgrades.

Client ↔ tracking server compatibility

  • On a standard OSS MLflow Tracking Server, you can query the server's version via the /version endpoint. This is useful for checking what the tracking server is running when you don't control the host.
  • MLflow currently publishes no official client–server compatibility matrix that guarantees behavior between arbitrary client and server versions. That is why most teams align clients to their server's major series and verify with smoke tests before rolling out deeper changes.
  • If you're logging from Databricks compute, the simplest and safest approach is to use the MLflow version bundled with your Databricks Runtime ML cluster (see the matrix in the Runtime release notes). That keeps client and server behavior aligned and reduces surprises during upgrades.
  • If you log from outside Databricks to the Databricks-hosted Tracking Server, set your tracking URI to Databricks and authenticate with your workspace host and token (for example, MLFLOW_TRACKING_URI=databricks, plus DATABRICKS_HOST and DATABRICKS_TOKEN). This is the supported way to use external clients against the Databricks-hosted service.

Recommended update planning approach

  • Standardize on a Databricks Runtime LTS for production workloads and align your client version to the MLflow version shown in the Runtime compatibility matrix for that LTS. This gives you a predictable upgrade cadence and longer support windows.
  • Track upstream changes via the MLflow release notes, and test new client versions against your tracking workflows (list/create experiments, log params/metrics/artifacts, and any Model Registry operations you rely on) before updating broadly.
  • When you need to support external clients, a practical rule of thumb is to keep clients within the same major MLflow series as your server and verify with smoke tests, especially if you use features that changed across majors (for example, REST endpoints or Registry semantics). Then roll out in stages to reduce risk. (Best practice; no official matrix is published.)

How to check your current versions

  • From a Databricks notebook/job (on-cluster), check the client version you're actually using:

```python
import mlflow
print(mlflow.__version__)
```
  • If you operate any OSS MLflow servers, check the server version directly:

```bash
curl -sS http://<your-mlflow-server>:<port>/version
```
 
Hope this helps!