Machine Learning
Dive into the world of machine learning on the Databricks platform. Explore discussions on algorithms, model training, deployment, and more. Connect with ML enthusiasts and experts.

Problem serving a langchain model on Databricks

hawa
New Contributor II

Hi, I've encountered a problem serving a LangChain model that I just created successfully on Databricks.

I was using the following code to register the model in Unity Catalog:

from mlflow.models import infer_signature
import mlflow
import langchain

mlflow.set_registry_uri("databricks-uc")
model_name = "model1"

# `chain`, `question`, `answer`, and `get_retriver` are defined earlier in the notebook:
# `chain` is the RAG chain, `question`/`answer` are example input/output used to infer
# the signature, and `get_retriver` is the loader function that rebuilds the retriever.
with mlflow.start_run(run_name="clippy_rag") as run:
    signature = infer_signature(question, answer)
    model_info = mlflow.langchain.log_model(
        chain,
        loader_fn=get_retriver,
        artifact_path="chain",
        registered_model_name=model_name,
        pip_requirements=[
            "mlflow==" + mlflow.__version__,
            "langchain==" + langchain.__version__,
            "databricks-vectorsearch",
        ],
        signature=signature,
    )
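
In case it helps, get_retriver isn't shown here, but a loader_fn for a Databricks Vector Search retriever generally looks something like this sketch (the endpoint and index names are placeholders, not my real ones):

# Sketch of a loader_fn for mlflow.langchain.log_model. The endpoint and index
# names below are placeholders. Depending on the langchain version, DatabricksVectorSearch
# may need to be imported from langchain_community.vectorstores instead.
from databricks.vector_search.client import VectorSearchClient
from langchain.vectorstores import DatabricksVectorSearch

def get_retriver(persist_dir=None):
    # MLflow calls this at load time to rebuild the retriever for the logged chain.
    vsc = VectorSearchClient()
    index = vsc.get_index(
        endpoint_name="vs_endpoint",           # placeholder endpoint name
        index_name="catalog.schema.my_index",  # placeholder index name
    )
    # For self-managed embeddings, pass embedding= and text_column= as well.
    return DatabricksVectorSearch(index).as_retriever()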
 
The UI shows that the model is ready, but when I served this model it showed: "Model with name 'model1' and version '1' is not successfully registered. Ensure model version has finished registration before use in model serving." Do you know what the issue is here?
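
For what it's worth, one way I know of to check whether the version actually finished registering, outside of the UI, is to query the registry with MlflowClient (the model name below mirrors the one above; the full catalog.schema.model name is needed if it's registered in Unity Catalog):

import mlflow
from mlflow.tracking import MlflowClient

mlflow.set_registry_uri("databricks-uc")
client = MlflowClient()

# Look up the version that serving complained about.
mv = client.get_model_version(name="model1", version="1")
print(mv.status)          # should be "READY" once registration has finished
print(mv.status_message)  # any failure detail reported by the registry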
1 REPLY

hawa
New Contributor II

I suspect the issue comes from this small error I got: "Got error: Must specify a chain Type in config." I used chain_type="stuff" when building the chain, but I'm not sure how to fix it.
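
From what I understand, that error means the serialized chain config has no chain type entry when MLflow reloads the chain. A minimal sketch of one common way to build the chain so the type gets recorded (llm and the retriever loader stand in for whatever objects are actually used above):

# Sketch of building the chain so its serialized config records a chain type.
from langchain.chains import RetrievalQA

chain = RetrievalQA.from_chain_type(
    llm=llm,             # e.g. a Databricks-hosted chat model
    chain_type="stuff",  # gets written into the chain config that MLflow serializes
    retriever=get_retriver(),
)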
