Machine Learning
Dive into the world of machine learning on the Databricks platform. Explore discussions on algorithms, model training, deployment, and more. Connect with ML enthusiasts and experts.

An error occurred while loading the model. Failed to load the pickled function from a hexadecimal string

marcelo2108
Contributor

[8586fsbgpb] An error occurred while loading the model. Failed to load the pickled function from a hexadecimal string. Error: Can't get attribute 'transform_input' on <module '__main__' from '/opt/conda/envs/mlflow-env/bin/gunicorn'>.

I'm using the following functions to transform the input and output:

def transform_input(**request):
    print("Type of prompt", type(request["prompt"]))
    request["messages"] = [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": request["prompt"]},
    ]
    request["stop"] = ["\n\n"]
    print("Request format", request)
    return request

def transform_output(response):
    return response['candidates'][0]

# If using serving endpoint, the model serving endpoint is created in `02_[chat]_mlflow_logging_inference`
llm = Databricks(
    endpoint_name="llama2-7b-chat-completion",
    transform_input_fn=transform_input,
    transform_output_fn=transform_output,
    extra_params={"temperature": 0.01, "max_tokens": 300},
)


Is there anything else I'm missing to avoid this error?
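For context on why this error occurs: `pickle` stores a plain Python function by reference (module name plus qualified name), not by value. A minimal illustration, using a standard-library module (`posixpath` stands in for any importable module):

```python
import pickle
import posixpath

# pickle serializes module-level functions *by reference*: it records
# only "module.qualname" and re-imports the module at load time.
blob = pickle.dumps(posixpath.join)
print(b"posixpath" in blob)                  # True: only the names are stored
print(pickle.loads(blob) is posixpath.join)  # True: resolved by re-import

# A function defined in a notebook cell instead has
# __module__ == "__main__", so its pickle says "__main__.transform_input".
# In the model-serving container, __main__ is gunicorn's entry script,
# which has no such attribute, hence the AttributeError in the question.
```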
Accepted Solution

marcelo2108
Contributor

The solution I found was to define those functions in a separate Python file, e.g. custom_functions.py, and pass that file via code_paths when logging the model in MLflow:

with mlflow.start_run() as run:
    signature = infer_signature(question, answer)
    logged_model = mlflow.langchain.log_model(
        chain,
        artifact_path="chain",
        registered_model_name=registered_model_name,
        loader_fn=get_retriever,
        persist_dir=persist_directory,
        pip_requirements=[
            "mlflow==" + mlflow.__version__,
            "langchain==" + langchain.__version__,
            "sentence_transformers",
            "chromadb",
        ],
        code_paths=["custom_functions.py"],  # ships the helper module with the model
        # conda_env=conda_env,
        input_example=question,
        metadata={"task": "llm/v1/chat"},
        signature=signature,
        await_registration_for=900,  # wait up to 15 minutes for model registration
    )
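For completeness, a minimal sketch of what custom_functions.py might contain, reusing the function bodies from the question (the print statements are omitted; the file name comes from the post):

```python
# custom_functions.py
# Because these functions live in an importable module, pickle records
# them as "custom_functions.transform_input", and the serving container
# can re-import them, unlike functions defined in a notebook's __main__.

def transform_input(**request):
    # Wrap the raw prompt in a chat-style message list.
    request["messages"] = [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": request["prompt"]},
    ]
    request["stop"] = ["\n\n"]
    return request

def transform_output(response):
    # Return only the first candidate from the endpoint response.
    return response["candidates"][0]
```

In the notebook, import them with `from custom_functions import transform_input, transform_output` before building the chain, so the pickled references point at the module rather than at `__main__`.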


Replies

marcelo2108
Contributor

Hi @Retired_mod, I already put it at the top level of the cell script, exactly as you mentioned in the attached file, but no luck. Should I put it at the top of the notebook? Any other clue?


marcelo2108
Contributor

However, I could not progress in the end, because I then hit the error I reported in another thread:

[5bb99fzs2f] An error occurred while loading the model. You haven't configured the CLI yet! Please configure by entering `/opt/conda/envs/mlflow-env/bin/gunicorn configure`.
As described in:

Re: Problem when serving a langchain model on Data... - Databricks - 59506
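One thing worth checking (an assumption, not confirmed in this thread): that follow-up error suggests the serving container has no Databricks credentials, and the client falls back to CLI configuration when none are found. A way to rule that out is to make sure the standard auth environment variables are set in the environment the endpoint runs in; the values below are placeholders:

```python
import os

# Placeholders: substitute your workspace URL and a valid token.
# The Databricks SDK and the langchain Databricks wrapper read these
# variables before falling back to a CLI profile.
os.environ["DATABRICKS_HOST"] = "https://<your-workspace>.cloud.databricks.com"
os.environ["DATABRICKS_TOKEN"] = "<personal-access-token>"
```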
