I'm facing this exception after using mlflow.langchain.log_model and testing the logged model with the following command:
print(loaded_model.predict([{"query": "how does the performance of llama 2 compare to other local LLMs?"}]))
tasks failed. Errors: {0: 'error: TypeError("\'NoneType\' object is not callable")
Traceback (most recent call last):
  File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-ea7b0843-883a-4376-b604-6de66714bd87/lib/python3.10/site-packages/mlflow/langchain/api_request_parallel_processor.py", line 246, in call_api
    response = self.lc_model(
  File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-ea7b0843-883a-4376-b604-6de66714bd87/lib/python3.10/site-packages/langchain_core/_api/deprecation.py", line 145, in warning_emitting_wrapper
    return wrapped(*args, **kwargs)
  File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-ea7b0843-883a-4376-b604-6de66714bd87/lib/python3.10/site-packages/langchain/chains/base.py", line 383, in __call__
    return self.invoke(
  File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-ea7b0843-883a-4376-b604-6de66714bd87/lib/python3.10/site-packages/langchain/chains/base.py", line 168, in invoke
    raise e
  File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-ea7b0843-883a-4376-b604-6de66714bd87/lib/python3.10/site-packages/langchain/chains/base.py"
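As far as I can tell, the TypeError itself just means that something expected to be the chain (self.lc_model) is None and is being invoked as a function. A minimal illustration of the message, not MLflow internals:

```python
# lc_model here stands in for a chain attribute that was never restored on load;
# calling None produces exactly the TypeError shown in the traceback above.
lc_model = None
try:
    lc_model({"query": "test"})
except TypeError as err:
    message = str(err)

print(message)  # 'NoneType' object is not callable
```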
I'm using HuggingFace Transformers with Chroma as the vector store, and the model logs without errors as follows:
import mlflow
import langchain
from mlflow.models import infer_signature

registered_model_name = "llama2-13b-retrievalqa-chain"

with mlflow.start_run() as run:
    signature = infer_signature(question, answer)
    logged_model = mlflow.langchain.log_model(
        rag_pipeline,
        artifact_path="chain",
        registered_model_name=registered_model_name,
        loader_fn=get_retriever,
        persist_dir=persist_directory,
        pip_requirements=[
            "mlflow==" + mlflow.__version__,
            "langchain==" + langchain.__version__,
            "sentence_transformers",
            "chromadb",
        ],
        input_example=question,
        metadata={"task": "llm/v1/chat"},
        signature=signature,
        await_registration_for=900,  # wait up to 15 minutes for model registration to complete
    )

# Load the RetrievalQA chain
loaded_model = mlflow.pyfunc.load_model(logged_model.model_uri)
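For context: MLflow calls loader_fn with persist_dir at load time and expects it to return the chain's retriever. My get_retriever is roughly shaped like this sketch (the embedding model name is illustrative, not my actual configuration):

```python
def get_retriever(persist_directory):
    """Sketch of the loader_fn contract: MLflow invokes this with persist_dir
    when reloading the logged chain and expects a retriever back."""
    # Imports are deferred into the function so the module stays importable
    # in environments where the langchain extras are not installed.
    from langchain_community.embeddings import HuggingFaceEmbeddings
    from langchain_community.vectorstores import Chroma

    # Illustrative embedding model; the real one must match what was used
    # to build the persisted Chroma collection.
    embeddings = HuggingFaceEmbeddings(
        model_name="sentence-transformers/all-MiniLM-L6-v2"
    )
    vectordb = Chroma(
        persist_directory=persist_directory,
        embedding_function=embeddings,
    )
    return vectordb.as_retriever()
```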
Any clue about what this error means?