How to deploy a Databricks-managed workspace model to SageMaker from a Databricks notebook

Maverick1
Valued Contributor II

I want to deploy a registered model from Databricks-managed MLflow to a SageMaker endpoint via a Databricks notebook.

As of now, I am not able to run the mlflow sagemaker build-and-push-container command directly. What configurations or steps are needed to do that? I assume that a manual push of the Docker image from outside of Databricks should not be required, just like in open-source MLflow. There has to be an alternate way.
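For reference, the kind of call I am trying to get working from the notebook looks roughly like this (a minimal sketch; the app name, role ARN, region, and model name are all placeholders):

import mlflow.sagemaker as mfs

# Registered-model URI in the MLflow registry (placeholder name and stage)
model_uri = "models:/my-model/Production"

# Deploy the registered model as a SageMaker endpoint (all identifiers are examples)
mfs.deploy(
    app_name="my-sagemaker-endpoint",
    model_uri=model_uri,
    execution_role_arn="arn:aws:iam::123456789012:role/sagemaker-role",
    region_name="us-east-1",
    mode="create",
)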

Also, when I am trying to test it locally via the API, I am getting the below error.

Code:

import mlflow.sagemaker as mfs

# model_uri points at the model to serve (defined earlier in the notebook)
mfs.run_local(model_uri=model_uri, port=8000, image="test")

Error:

AttributeError: 'ConsoleBuffer' object has no attribute 'fileno'

Can someone shed some light on this topic?



Kaniz
Community Manager

Hi @Saurabh Verma​! My name is Kaniz, and I'm the technical moderator here. Great to meet you, and thanks for your question! Let's see if your peers in the community have an answer first; otherwise I will get back to you soon. Thanks.

Kaniz
Community Manager

Hi @Saurabh Verma​ , This link might help you.

https://www.mlflow.org/docs/latest/models.html#sagemaker-deployment

Kaniz
Community Manager

Also, can you please tell me which Databricks runtime you are using?

Maverick1
Valued Contributor II

@Kaniz Fatma​ : Thanks for replying.

I am using Databricks Runtime 9 ML.

The open-source MLflow implementation works fine, but I am getting an error when running the mlflow sagemaker command on top of Databricks notebooks.

User16871418122
Contributor III

There apparently exists a simple workaround: Databricks notebooks replace sys.stdout with a ConsoleBuffer object that does not implement fileno(), so you can add sys.stdout.fileno = lambda: 0 to resolve the issue.

The same issue was hit with the ray library as well.

User16871418122
Contributor III

@Saurabh Verma​ Please try!

import sys
import mlflow.sagemaker as mfs

# Patch the fileno() missing from Databricks' stdout replacement
sys.stdout.fileno = lambda: 0
mfs.run_local(model_uri=model_uri, port=8000, image="test")

@Gobinath Viswanathan​: Still getting the below error.

I have tried installing docker explicitly too, but the error still persists.

Note: I am running this inside Databricks notebook on managed AWS databricks.

Error:

Using the python_function flavor for local serving!

2021/11/24 13:01:07 INFO mlflow.sagemaker: launching docker image with path /tmp/tmpq622qyl6/model

2021/11/24 13:01:07 INFO mlflow.sagemaker: executing: docker run -v /tmp/tmpq622qyl6/model:/opt/ml/model/ -p 5432:8080 -e MLFLOW_DEPLOYMENT_FLAVOR_NAME=python_function --rm test serve

FileNotFoundError: [Errno 2] No such file or directory: 'docker'
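For context, run_local shells out to the docker CLI (the "executing: docker run ..." line above), and a stock Databricks cluster does not ship one. A quick way to confirm this, as a sketch using only the standard library:

import shutil

# Returns the path to the docker binary if it is on PATH, else None;
# on a standard Databricks cluster this prints None.
print(shutil.which("docker"))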

Atanu
Esteemed Contributor

https://docs.docker.com/engine/reference/builder/

https://forums.docker.com/t/no-such-file-or-directory-after-building-the-image/66143

These two references from the Docker side might be helpful. Let us know if this helps, @Saurabh Verma​.

Maverick1
Valued Contributor II

@Atanu Sarkar​ @Gobinath Viswanathan​ @Kaniz Fatma​: Thanks for reaching out. Unfortunately, the approaches in the links above work only when I am using open-source MLflow, where I have control over the folder structure and can create a separate Dockerfile.

The same is not allowed in the managed Databricks environment; the model artifacts are stored on a path that can only be accessed via the MLflow API.
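For example, the only way I can address those artifacts is through a registry URI via the MLflow API (a minimal sketch; the model name and stage are placeholders):

import mlflow.pyfunc

# Registry URI; there is no directly accessible file path behind it
model_uri = "models:/my-model/Production"

# MLflow resolves the URI and fetches the artifacts itself
model = mlflow.pyfunc.load_model(model_uri)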

If you have tried some other way and it worked for you, please let me know the complete steps. I am looking to push models registered in the Databricks-managed MLflow registry to SageMaker endpoints, and I also want to test this setup via the SageMaker local command.

Kaniz
Community Manager

Hi @Saurabh Verma​ , Can you please add this code and check if it works?

import sys

sys.stdout.fileno = lambda: False

Maverick1
Valued Contributor II

@Kaniz Fatma​ : Hi Kaniz,

The suggested solution is not working on Databricks notebooks; I still get an error.

Maverick1
Valued Contributor II

@Kaniz Fatma​ @Gobinath Viswanathan​ @Atanu Sarkar​ :

Hi All,

The direct methods above are not working. So, I downloaded the model files using mlflow and tried to run "mlflow sagemaker build-and-push-container" in order to push the model image to ECR.

This step is also failing, with a "no module named docker" error from the "mlflow.models.docker_utils" module.

I am currently running Databricks 10.2 ML runtime.


After installing docker via "pip install docker", I now get the error:

"docker.errors.DockerException: Error while fetching server API version: ('Connection aborted.', FileNotFoundError(2, 'No such file or directory'))"

@Atanu Sarkar​  @Gobinath Viswanathan​ @Kaniz Fatma​ :

I have been trying to push a registered model in Databricks-managed MLflow to a SageMaker endpoint. I have been able to do it, but there are some manual steps I needed to perform on my local system to make it work. Could you help me understand whether I am doing it correctly, or whether there is a bug in the Databricks ML runtime?

Below are the steps I took:

  1. Step 1: Log the model
    1. Ran the model code and registered the model in MLflow; moved the model to the Production stage.
  2. Step 2: Deploy the model
    1. Installed the AWS CLI (via pip) and configured the target AWS environment/account. This account has a role ARN set up with SageMaker full access and ECR full access.
    2. Was able to connect to the target AWS account from the Databricks notebook.
    3. While deploying the model as a SageMaker endpoint via “mlflow.sagemaker.deploy”, all intermediate objects were created, but I got an error because it could not find the container image in ECR (see the sketch after this list). My initial assumption was that the function itself would build the container from the current model code.
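For reference, a hedged sketch of pointing the deploy call at an ECR image explicitly via the image_url parameter, once such an image exists (the account ID, region, repository, and names below are placeholders):

import mlflow.sagemaker as mfs

# image_url must point at the image previously pushed to ECR by
# "mlflow sagemaker build-and-push-container"; all identifiers are examples
mfs.deploy(
    app_name="my-endpoint",
    model_uri="models:/my-model/Production",
    image_url="123456789012.dkr.ecr.us-east-1.amazonaws.com/mlflow-pyfunc:latest",
    execution_role_arn="arn:aws:iam::123456789012:role/sagemaker-role",
    region_name="us-east-1",
)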

So, I downloaded the model files into a folder on a Databricks local path using the mlflow library.
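One way to do that download (a sketch; the run ID and paths are placeholders, and this assumes the run that produced the registered version is known):

import os
from mlflow.tracking import MlflowClient

os.makedirs("/tmp/model-files", exist_ok=True)
client = MlflowClient()

# Download the run's "model" artifact directory to a local folder
local_dir = client.download_artifacts(run_id="abc123run", path="model", dst_path="/tmp/model-files")
print(local_dir)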

  1. Now, in order to create a container, I ran the “mlflow sagemaker build-and-push-container” command from the Databricks local path where the model files are present. It shows the error “no module named docker” from the “mlflow.docker_utils” module.
  2. To resolve this, I ran “pip install docker”, but after that I get the error: docker.errors.DockerException: Error while fetching server API version: ('Connection aborted.', FileNotFoundError(2, 'No such file or directory'))


I have checked that this error occurs when the docker daemon processes themselves are not running. I also haven’t been able to find any docker service executable in “/etc/init.d/”, where the general service executables live.
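To see whether a daemon is reachable at all, the docker Python client can be pinged directly (a sketch; this reproduces the same "Connection aborted" failure when no daemon is running):

import docker  # requires "pip install docker"

try:
    # from_env() reads DOCKER_HOST etc.; ping() hits the daemon's API
    client = docker.from_env()
    client.ping()
    print("docker daemon is reachable")
except docker.errors.DockerException as exc:
    print(f"no docker daemon available: {exc}")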

The only way everything works is when I download all the model files onto my local system, run Docker Desktop so the docker daemons are up, and then run “mlflow sagemaker build-and-push-container” from inside the model folder. That created an image in ECR which is correctly referenced by the “mlflow.sagemaker.deploy” command.

My question is: is this the right process? Do we need to build the image locally in order to make it work?

My assumption was that the “mlflow.sagemaker.deploy” command would take care of everything, or at most that the “mlflow sagemaker build-and-push-container” command would run from the Databricks notebook itself.

Kaniz
Community Manager

Hi @Saurabh Verma​ , Yes, it's the right process. Thanks.
