Machine Learning
Dive into the world of machine learning on the Databricks platform. Explore discussions on algorithms, model training, deployment, and more. Connect with ML enthusiasts and experts.

UC Model Deployment across Databricks Instances

srkam
New Contributor

Hello, we have multiple Databricks workspaces, each representing an environment (dev, qa, rel, prod, etc.). We developed a model in the dev workspace and registered it in the Unity Catalog (UC) model registry using MLflow. Now we are trying to find the best way to deploy this registered model to the target environments. We want to avoid rerunning the training pipeline in the target environment; instead, we want to promote/copy/transition the registered model, its versions, and its experiments into the target environment.

Can you please help us on how to achieve this?

Thanks

1 REPLY

iyashk-DB
Databricks Employee

You can use UC's centralized model registry and MLflow's copy APIs.

If all target workspaces attach to the same Unity Catalog metastore, reference and promote models via their 3-level UC names; use MLflow's copy_model_version to copy the exact artifacts from dev to qa/rel/prod, and manage deployment with aliases like Champion/Shadow. This avoids retraining and keeps one source of truth.
Ref Doc - https://docs.databricks.com/aws/en/machine-learning/manage-model-lifecycle
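A minimal sketch of that shared-metastore flow; the catalog/schema/model names (dev.ml_models.fraud_model, prod.ml_models.fraud_model) and the source version number are illustrative placeholders:

import mlflow
from mlflow import MlflowClient

# Point the MLflow client at the Unity Catalog model registry.
mlflow.set_registry_uri("databricks-uc")
client = MlflowClient()

# Copy the dev model version's artifacts into the prod catalog without retraining;
# the target registered model assigns its own new version number.
copied = client.copy_model_version(
    src_model_uri="models:/dev.ml_models.fraud_model/1",
    dst_name="prod.ml_models.fraud_model",
)

# Route deployment through an alias so serving/inference code can always load
# "models:/prod.ml_models.fraud_model@Champion" regardless of the version number.
client.set_registered_model_alias(
    name="prod.ml_models.fraud_model",
    alias="Champion",
    version=copied.version,
)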

If environments run on different, isolated metastores/workspaces, use the community mlflow-export-import tooling to migrate registered models, versions, and experiments/runs between workspaces. This is the recommended way to copy MLflow objects (models, runs, experiments) across workspaces when UC sharing isn't possible.
Ref Doc - https://github.com/mlflow/mlflow-export-import
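A rough sketch of that migration using the package's Python API; the module paths, function names, and parameters below are assumptions based on the project's README (the repo also documents equivalent export-model / import-model CLI commands), so verify them there before relying on this:

# Illustrative only: confirm module paths and signatures against the
# mlflow-export-import README before use.
from mlflow_export_import.model.export_model import export_model
from mlflow_export_import.model.import_model import import_model

# Run against the source (dev) workspace: writes the registered model, its versions,
# and the backing runs/experiments to a directory (hypothetical path below).
export_model(model_name="fraud_model", output_dir="/dbfs/tmp/export/fraud_model")

# Run against the target (qa/rel/prod) workspace: recreates the model, its versions,
# and their runs from the exported directory under a new experiment.
import_model(
    model_name="fraud_model",
    experiment_name="/Shared/fraud_model_imported",
    input_dir="/dbfs/tmp/export/fraud_model",
)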
