11-10-2021 08:50 PM
I want to deploy a registered model from Databricks managed MLflow to a SageMaker endpoint via a Databricks notebook.
As of now, I am not able to run the mlflow sagemaker build-and-push-container command directly from the notebook. What configuration or steps are needed to do that? I assume a manual push of the Docker image from outside Databricks should not be required, just as in open-source MLflow; there has to be an alternate way.
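For reference, this is roughly the deployment call I am trying to get working (just a sketch; the app name, model URI, role ARN, and region below are placeholders, not real values):
import mlflow.sagemaker as mfs
# Deploy a registered model to a SageMaker endpoint; all argument values are placeholders.
mfs.deploy(
    app_name="my-model-endpoint",
    model_uri="models:/my_model/Production",
    execution_role_arn="arn:aws:iam::<account-id>:role/<sagemaker-execution-role>",
    region_name="<region>",
    mode="create",
)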
Also, when I try to test the model locally via the API, I get the error below.
Code:
import mlflow.sagemaker as mfs
mfs.run_local(model_uri=model_uri, port=8000, image="test")
Error:
AttributeError: 'ConsoleBuffer' object has no attribute 'fileno'
Can someone shed some light on this topic?
11-16-2021 09:12 PM
@Kaniz Fatma: Thanks for replying.
I am using Databricks Runtime 9 ML.
The open-source MLflow implementation works fine, but I get an error when running the mlflow sagemaker command from a Databricks notebook.
11-23-2021 09:17 PM
@Saurabh Verma Please try!
import sys  # needed for the stdout patch below
import mlflow.sagemaker as mfs
# Databricks notebooks replace sys.stdout with a ConsoleBuffer object that has no
# fileno(); patching it lets run_local pass a file descriptor to the Docker subprocess.
sys.stdout.fileno = lambda: 0
mfs.run_local(model_uri=model_uri, port=8000, image="test")
11-24-2021 10:55 PM
@Gobinath Viswanathan: I am still getting the error below.
I have tried installing Docker explicitly too, but the error persists.
Note: I am running this inside a Databricks notebook on managed AWS Databricks.
Error:
Using the python_function flavor for local serving!
2021/11/24 13:01:07 INFO mlflow.sagemaker: launching docker image with path /tmp/tmpq622qyl6/model
2021/11/24 13:01:07 INFO mlflow.sagemaker: executing: docker run -v /tmp/tmpq622qyl6/model:/opt/ml/model/ -p 5432:8080 -e MLFLOW_DEPLOYMENT_FLAVOR_NAME=python_function --rm test serve
FileNotFoundError: [Errno 2] No such file or directory: 'docker'
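For what it's worth, a quick check (my own sketch, not from any docs) confirms that the Docker CLI simply is not present on the driver node:
import shutil
import subprocess
# Look for a docker executable on the driver's PATH; this prints None on my cluster.
print(shutil.which("docker"))
# If it were present, this would report whether the daemon is reachable.
if shutil.which("docker"):
    print(subprocess.run(["docker", "info"], capture_output=True, text=True).stdout)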
11-27-2021 07:06 AM
https://docs.docker.com/engine/reference/builder/
https://forums.docker.com/t/no-such-file-or-directory-after-building-the-image/66143
These two references from the Docker side might be helpful. Let us know if this helps. @Saurabh Verma
11-28-2021 08:54 PM
@Atanu Sarkar @Gobinath Viswanathan @Kaniz Fatma: Thanks for reaching out. Unfortunately, the links above only help when I am working with open-source MLflow, where I control the folder structure and can create a separate Dockerfile.
That is not possible in the managed Databricks environment; the model artifacts are stored on a path that can only be accessed through the MLflow API.
If you have tried some other approach that worked for you, please share the complete steps. I want to push models registered in the Databricks managed MLflow registry to SageMaker endpoints, and I also want to test the setup via the SageMaker local command.
01-11-2022 05:37 AM
@Kaniz Fatma @Gobinath Viswanathan @Atanu Sarkar:
Hi all,
The direct methods above are not working, so I downloaded the model files using MLflow and am trying to run mlflow sagemaker build-and-push-container to push the model image to ECR.
This step also fails: I get a "No module named docker" error from the mlflow.models.docker_utils module.
I am currently running the Databricks 10.2 ML runtime.
After installing the Docker Python client via pip install docker, I now get this error:
docker.errors.DockerException: Error while fetching server API version: ('Connection aborted.', FileNotFoundError(2, 'No such file or directory'))
01-13-2022 10:59 PM
@Atanu Sarkar @Gobinath Viswanathan @Kaniz Fatma:
I have been trying to push a registered model from Databricks managed MLflow to a SageMaker endpoint. I have been able to do it, but only with some manual steps on my local system. Could you help me understand whether I am doing this correctly, or whether there is a bug in the Databricks ML runtime?
Below are the steps I followed:
First, I downloaded the model files into a folder on the Databricks local path using the MLflow library.
I have verified that the earlier error appears when the Docker daemon processes themselves are not running. I also could not find any Docker service executable under the /etc/init.d/ path, where general service executables usually live.
The only way everything works is when I download all the model files onto my local system, start Docker Desktop so the Docker daemon is up, and then run mlflow sagemaker build-and-push-container from inside the model folder. That creates an image in ECR which the mlflow.sagemaker.deploy command then references correctly (a sketch of this workflow follows below).
My question is: is this the right process? Do we need to build the image locally to make it work?
My assumption was that mlflow.sagemaker.deploy would take care of everything, or at the very least that mlflow sagemaker build-and-push-container could be run from the Databricks notebook itself.
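For completeness, here is a sketch of the manual workflow that did work for me; the endpoint name, model URI, ECR URI, role ARN, and region are all placeholders:
# Step 1 (notebook or local machine): download the model artifacts, e.g. with
#   MlflowClient().download_artifacts(run_id, "model", local_dir)
# Step 2 (local machine with Docker Desktop running): build and push the serving image:
#   mlflow sagemaker build-and-push-container
# Step 3 (Databricks notebook): deploy, pointing at the image pushed in step 2.
import mlflow.sagemaker as mfs
mfs.deploy(
    app_name="my-model-endpoint",
    model_uri="models:/my_model/Production",
    image_url="<account-id>.dkr.ecr.<region>.amazonaws.com/mlflow-pyfunc:latest",
    execution_role_arn="arn:aws:iam::<account-id>:role/<sagemaker-execution-role>",
    region_name="<region>",
    mode="create",
)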