Community Platform Discussions
Connect with fellow community members to discuss general topics related to the Databricks platform, industry trends, and best practices. Share experiences, ask questions, and foster collaboration within the community.

Deploy mlflow model to Sagemaker

sanjay
Valued Contributor II

Hi,

I am trying to deploy an MLflow model to SageMaker. My MLflow model is registered in Databricks.

I followed the URL below to deploy, and it requires an ECR image. For ECR, I can either build a custom image and push it to ECR myself, or, as the URL below says, obtain an mlflow-pyfunc image URL in ECR: "Contact your Databricks representative for an mlflow-pyfunc image URL in ECR."

I would appreciate any help in building the ECR image.

https://docs.databricks.com/en/archive/model-export/mleap-model-deployment-on-sagemaker.html

Regards,
Sanjay

1 REPLY

Kaniz_Fatma
Community Manager

Hi @sanjay, deploying an MLflow model to Amazon SageMaker is a great way to scale your machine learning inference. MLflow simplifies the deployment process by providing easy-to-use commands, without requiring you to write complex container definitions.

Here are the steps to deploy your MLflow model to SageMaker:

  1. Preparation:

    • Make sure you have the following tools installed:
      • MLflow
      • AWS CLI
      • Docker
    • Set up your AWS account and permissions correctly. You'll need an IAM role with permissions to create a SageMaker endpoint, access an S3 bucket, and use the Elastic Container Registry (ECR). This role should also be assumable by the user performing the deployment.
  2. Create an MLflow Model:

    • Before deploying, ensure you have an MLflow Model. If you don't have one, you can create a sample scikit-learn model by following the MLflow Tracking Quickstart.
    • Note down the model URI, such as runs:/<run_id>/<artifact_path> (or models:/<model_name>/<model_version> if you registered the model in the MLflow Model Registry).
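To make step 2 concrete, here is a minimal sketch of logging a scikit-learn model and printing its runs:/ URI. This assumes mlflow and scikit-learn are installed locally; the dataset and artifact path are illustrative, not from the thread.

```python
# Minimal sketch (assumption: mlflow and scikit-learn are installed);
# the iris dataset and "model" artifact path are illustrative choices.
def format_model_uri(run_id: str, artifact_path: str) -> str:
    """Compose the runs:/ URI that the deployment commands expect."""
    return f"runs:/{run_id}/{artifact_path}"

if __name__ == "__main__":
    import mlflow
    from sklearn.datasets import load_iris
    from sklearn.linear_model import LogisticRegression

    X, y = load_iris(return_X_y=True)
    with mlflow.start_run() as run:
        model = LogisticRegression(max_iter=200).fit(X, y)
        mlflow.sklearn.log_model(model, artifact_path="model")
        # This is the URI you pass to the deployment command in step 4.
        print(format_model_uri(run.info.run_id, "model"))
```

If you registered the model in the Model Registry instead, the models:/ form of the URI works the same way in the deployment commands.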
  3. Test Your Model Locally:

    • It's recommended to test your model locally before deploying it to a production environment.
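A simple way to do that local test is to load the model back as a pyfunc and run one prediction before creating any AWS resources. This is a hedged sketch: it assumes mlflow is installed, and the model URI and feature names are placeholders you would replace with your own.

```python
# Hedged local smoke test (assumption: mlflow is installed; the URI and
# feature names below are placeholders, not values from the thread).
MODEL_URI = "runs:/<run_id>/model"  # placeholder, substitute your own

def smoke_test(model, sample):
    """Run one local prediction; works for any object with a predict method."""
    preds = model.predict(sample)
    assert len(preds) == len(sample), "expected one prediction per input row"
    return preds

if __name__ == "__main__":
    import mlflow.pyfunc
    import pandas as pd

    model = mlflow.pyfunc.load_model(MODEL_URI)  # downloads artifacts locally
    print(smoke_test(model, pd.DataFrame({"feature_a": [1.0],
                                          "feature_b": [2.0]})))
```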
  4. Deploy the Model to SageMaker Endpoint:

    • Use the MLflow CLI to deploy your model:
      mlflow deployments create -t sagemaker --name <endpoint_name> -m <model_uri> -C region_name=<aws_region>
      
      Replace <model_uri> with your actual model URI, <endpoint_name> with your desired endpoint name, and <aws_region> with the AWS region where you want to deploy the endpoint. (The older "mlflow sagemaker deploy" command has been deprecated in favor of "mlflow deployments create".)
    • MLflow will build a Docker image from your MLflow Model, push it to ECR, and create a SageMaker endpoint using this image. It will also upload the model artifact to an S3 bucket and configure the endpoint to download the model from there.
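The same deployment step can also be driven from Python via mlflow.deployments, which some teams prefer for automation. A hedged sketch, assuming mlflow with the SageMaker extras is installed, AWS credentials are configured, and the endpoint name, model URI, region, and role ARN below are placeholders:

```python
# Hedged Python-API equivalent of the CLI deployment step (assumption:
# mlflow with SageMaker support is installed and AWS credentials carrying
# the IAM role from step 1 are configured; all names are placeholders).
def target_uri(region: str) -> str:
    """Build the mlflow.deployments target URI for a SageMaker region."""
    return f"sagemaker:/{region}"

if __name__ == "__main__":
    from mlflow.deployments import get_deploy_client

    client = get_deploy_client(target_uri("us-east-1"))
    client.create_deployment(
        name="my-endpoint",                # desired SageMaker endpoint name
        model_uri="runs:/<run_id>/model",  # placeholder model URI
        config={"execution_role_arn":
                "arn:aws:iam::<account>:role/<role>"},  # role from step 1
    )
```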
  5. Access the SageMaker Endpoint:

    • The SageMaker endpoint provides REST endpoints for inference. For example, the /invocations endpoint accepts CSV and JSON input data and returns prediction results.

Remember that MLflow supports both the general pyfunc deployment (default) and the mleap flavor for SageMaker deployment. If you choose the mleap flavor, the endpoint will only accept JSON-serialized pandas DataFrames in the split orientation. You can specify this format using a Content-Type request header value of application/json.
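To illustrate the split orientation mentioned above, here is a sketch that builds such a payload and sends it to the /invocations endpoint through the SageMaker runtime API. It assumes boto3 is installed; the endpoint name, region, and feature names are placeholders.

```python
# Sketch of querying the endpoint with split-oriented JSON (assumption:
# boto3 is installed; endpoint name, region, and features are placeholders).
import json

def split_payload(columns, rows):
    """Serialize rows into the pandas 'split' orientation described above."""
    return json.dumps({"columns": list(columns),
                       "data": [list(r) for r in rows]})

if __name__ == "__main__":
    import boto3

    runtime = boto3.client("sagemaker-runtime", region_name="us-east-1")
    resp = runtime.invoke_endpoint(
        EndpointName="my-endpoint",        # placeholder endpoint name
        ContentType="application/json",    # header noted in the reply above
        Body=split_payload(["feature_a", "feature_b"], [[1.0, 2.0]]),
    )
    print(resp["Body"].read().decode())
```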

For more details and Python API references, check out the MLflow documentation on deploying to SageMaker.
