Machine Learning

TypeError("'NoneType' object is not callable") in api_request_parallel_processor.py

marcelo2108
Contributor

I'm facing this exception after using mlflow.langchain.log_model and testing the logged model with the following command:

print(loaded_model.predict([{"query": "how does the performance of llama 2 compare to other local LLMs?"}]))


tasks failed. Errors: {0: 'error: TypeError("'NoneType' object is not callable")
Traceback (most recent call last):
  File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-ea7b0843-883a-4376-b604-6de66714bd87/lib/python3.10/site-packages/mlflow/langchain/api_request_parallel_processor.py", line 246, in call_api
    response = self.lc_model(
  File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-ea7b0843-883a-4376-b604-6de66714bd87/lib/python3.10/site-packages/langchain_core/_api/deprecation.py", line 145, in warning_emitting_wrapper
    return wrapped(*args, **kwargs)
  File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-ea7b0843-883a-4376-b604-6de66714bd87/lib/python3.10/site-packages/langchain/chains/base.py", line 383, in __call__
    return self.invoke(
  File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-ea7b0843-883a-4376-b604-6de66714bd87/lib/python3.10/site-packages/langchain/chains/base.py", line 168, in invoke
    raise e
  File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-ea7b0843-883a-4376-b604-6de66714bd87/lib/python3.10/site-packages/langchain/chains/base.py"

I'm using HuggingFace Transformers and ChromaDB for vector search, and the model is logged without errors as follows:

import mlflow
import langchain
from mlflow.models import infer_signature

registered_model_name = "llama2-13b-retrievalqa-chain"

with mlflow.start_run() as run:
        signature = infer_signature(question, answer)
        logged_model = mlflow.langchain.log_model(
            rag_pipeline,
            artifact_path="chain",
            registered_model_name=registered_model_name,
            loader_fn=get_retriever,
            persist_dir=persist_directory,
            pip_requirements=["mlflow==" + mlflow.__version__,"langchain==" + langchain.__version__,"sentence_transformers","chromadb"],
            input_example=question,
            metadata={"task": "llm/v1/chat"},
            signature=signature,
            await_registration_for=900 # wait for 15 minutes for model registration to complete
        )

# Load the retrievalQA chain
loaded_model = mlflow.pyfunc.load_model(logged_model.model_uri)
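
For context, rag_pipeline and get_retriever look roughly like the sketch below (simplified; the embedding model name, model id, and paths are placeholders rather than my exact values):

from langchain.chains import RetrievalQA
from langchain.embeddings import HuggingFaceEmbeddings
from langchain.llms import HuggingFacePipeline
from langchain.vectorstores import Chroma
from transformers import pipeline

persist_directory = "/dbfs/tmp/chroma_db"  # placeholder path

def get_retriever(persist_dir):
    # loader_fn: mlflow.langchain.log_model calls this at load time to rebuild the retriever
    embeddings = HuggingFaceEmbeddings(model_name="sentence-transformers/all-MiniLM-L6-v2")
    vectordb = Chroma(persist_directory=persist_dir, embedding_function=embeddings)
    return vectordb.as_retriever()

# Local HuggingFace pipeline wrapped as a LangChain LLM (this is the setup that later fails)
hf_pipe = pipeline("text-generation", model="meta-llama/Llama-2-13b-chat-hf")
llm = HuggingFacePipeline(pipeline=hf_pipe)

rag_pipeline = RetrievalQA.from_chain_type(
    llm=llm,
    chain_type="stuff",
    retriever=get_retriever(persist_directory),
)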

Any clue about what this error means?

1 ACCEPTED SOLUTION

marcelo2108
Contributor

I verified all the steps, @Retired_mod, and the objects and structure looked good. As far as I understood from my tests, LangChain RAG features such as RetrievalQA.from_chain_type do not work well when the LLM is instantiated with HuggingFacePipeline: the logged model failed with the "'NoneType' object is not callable" error. When I instantiated the foundation model with HuggingFaceHub instead, it worked fine, both when logging the model with MLflow and when serving it on Databricks.
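
For anyone hitting the same issue, here is a minimal sketch of the instantiation that worked for me (the repo_id and model_kwargs are placeholders, and it assumes HUGGINGFACEHUB_API_TOKEN is set in the environment):

from langchain.chains import RetrievalQA
from langchain.llms import HuggingFaceHub

# Hosted model via the Hugging Face Hub instead of a local HuggingFacePipeline
llm = HuggingFaceHub(
    repo_id="meta-llama/Llama-2-13b-chat-hf",  # placeholder model id
    model_kwargs={"temperature": 0.1, "max_new_tokens": 256},
)

rag_pipeline = RetrievalQA.from_chain_type(
    llm=llm,
    chain_type="stuff",
    retriever=get_retriever(persist_directory),  # same loader_fn passed to log_model
)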

 


