
0: 'error: TypeError("\'NoneType\' object is not callable") in api_request_parallel_processor.py

marcelo2108
Contributor

I'm facing this exception after using mlflow.langchain.log_model and testing the logged model with the following command:

print(loaded_model.predict([{"query": "how does the performance of llama 2 compare to other local LLMs?"}]))


tasks failed. Errors: {0: 'error: TypeError("\'NoneType\' object is not callable")
Traceback (most recent call last):
  File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-ea7b0843-883a-4376-b604-6de66714bd87/lib/python3.10/site-packages/mlflow/langchain/api_request_parallel_processor.py", line 246, in call_api
    response = self.lc_model(
  File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-ea7b0843-883a-4376-b604-6de66714bd87/lib/python3.10/site-packages/langchain_core/_api/deprecation.py", line 145, in warning_emitting_wrapper
    return wrapped(*args, **kwargs)
  File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-ea7b0843-883a-4376-b604-6de66714bd87/lib/python3.10/site-packages/langchain/chains/base.py", line 383, in __call__
    return self.invoke(
  File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-ea7b0843-883a-4376-b604-6de66714bd87/lib/python3.10/site-packages/langchain/chains/base.py", line 168, in invoke
    raise e
  File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-ea7b0843-883a-4376-b604-6de66714bd87/lib/python3.10/site-packages/langchain/chains/base.py"
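For context, this TypeError is what Python raises when a deserialized chain component comes back as None and the chain then tries to call it. A minimal sketch of the symptom (llm = None stands in for a component that failed to deserialize; it is not code from the post):

```python
# Minimal reproduction of the symptom: if deserialization silently yields
# None for the llm, invoking it raises the exact TypeError in the traceback.
llm = None  # stand-in for an LLM component that failed to deserialize

try:
    llm("how does the performance of llama 2 compare to other local LLMs?")
except TypeError as exc:
    print(exc)  # 'NoneType' object is not callable
```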

I'm using HuggingFace Transformers and ChromaDB for vector search, and the model logs without errors, as follows:

import mlflow
import langchain
from mlflow.models import infer_signature

registered_model_name = "llama2-13b-retrievalqa-chain"

with mlflow.start_run() as run:
    signature = infer_signature(question, answer)
    logged_model = mlflow.langchain.log_model(
        rag_pipeline,
        artifact_path="chain",
        registered_model_name=registered_model_name,
        loader_fn=get_retriever,
        persist_dir=persist_directory,
        pip_requirements=[
            "mlflow==" + mlflow.__version__,
            "langchain==" + langchain.__version__,
            "sentence_transformers",
            "chromadb",
        ],
        input_example=question,
        metadata={"task": "llm/v1/chat"},
        signature=signature,
        await_registration_for=900,  # wait up to 15 minutes for model registration
    )

# Load the RetrievalQA chain
loaded_model = mlflow.pyfunc.load_model(logged_model.model_uri)

Any clue about what this error means?

1 ACCEPTED SOLUTION

marcelo2108
Contributor

I verified all the steps, @Kaniz, and the objects and structure looked good. As far as I understood from my tests, LangChain RAG features such as RetrievalQA.from_chain_type do not work well when the LLM is instantiated via llm = HuggingFacePipeline(...). The problem ("'NoneType' object is not callable") happened when I used the logged model. When I instantiated the foundation model via llm = HuggingFaceHub(...) instead, it worked fine, both when logging the model with MLflow and when serving it in Databricks.
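A hedged sketch of that workaround, for readers hitting the same error (build_rag_chain is a hypothetical helper; the repo_id is a placeholder, not a value from this thread):

```python
def build_rag_chain(retriever, api_token):
    """Hypothetical helper: build the RetrievalQA chain on top of
    HuggingFaceHub (hosted inference) instead of a local
    HuggingFacePipeline, which is the switch described above."""
    from langchain.llms import HuggingFaceHub
    from langchain.chains import RetrievalQA

    llm = HuggingFaceHub(
        repo_id="meta-llama/Llama-2-13b-chat-hf",  # placeholder model id
        huggingfacehub_api_token=api_token,
    )
    return RetrievalQA.from_chain_type(
        llm=llm,
        chain_type="stuff",
        retriever=retriever,
    )
```

For example, one might call build_rag_chain(get_retriever(persist_directory), token) and pass the result as rag_pipeline to mlflow.langchain.log_model as in the original snippet.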


2 REPLIES 2

Kaniz
Community Manager

Hi @marcelo2108,

 

Check the Model and Pipeline:

  • Ensure that your rag_pipeline and other relevant objects (such as get_retriever) are correctly initialized and loaded. Verify that they are not None.
  • Confirm that the rag_pipeline is a valid HuggingFace Transformers pipeline and that it has been trained or loaded successfully.
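The first checklist item can be turned into a quick guard before logging (validate_components is a hypothetical helper, not an MLflow API):

```python
# Hedged sketch: fail fast if any chain component is None before logging,
# since a None component only surfaces later, at predict time, as
# "'NoneType' object is not callable".
def validate_components(**components):
    """Return the names of any components that are None."""
    return [name for name, obj in components.items() if obj is None]

# Example with a stand-in pipeline and a deliberately missing loader:
missing = validate_components(rag_pipeline=object(), get_retriever=None)
print(missing)  # ['get_retriever']
```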

Inspect the mlflow.langchain.log_model Call:

  • Double-check the arguments passed to mlflow.langchain.log_model. Make sure that all required parameters are provided correctly.
  • Verify that the rag_pipeline and other relevant objects are valid and callable.

Version Compatibility:

  • Ensure that you're using compatible versions of MLflow, HuggingFace Transformers, and LangChain. Incompatibilities between library versions can lead to unexpected behavior.
  • Check if there are any known issues related to specific versions of these libraries. You can refer to their documentation or community forums for guidance.
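One way to rule out an environment mismatch is to pin the exact installed versions into pip_requirements (pinned_requirements is a hypothetical helper, not an MLflow API):

```python
# Hedged sketch: build exact "pkg==version" pins for the packages that go
# into pip_requirements, skipping any that are not installed locally.
import importlib.metadata

def pinned_requirements(*packages):
    """Return exact version pins for the installed packages."""
    pins = []
    for pkg in packages:
        try:
            pins.append(f"{pkg}=={importlib.metadata.version(pkg)}")
        except importlib.metadata.PackageNotFoundError:
            pass  # skip packages not installed in this environment
    return pins

print(pinned_requirements("mlflow", "langchain", "chromadb"))
```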

If you encounter any specific issues or need further assistance, feel free to share additional details, and I'll be happy to assist! 🚀


 