Why This Error Occurs
- Streaming interface: The MLflow model must implement the predict_stream method (using MLflow's LLM/ChatModel interface).
- Model registration: If you saved your model with MLflow but did not use an LLM/ChatModel wrapper that supports streaming, only standard prediction will work; streaming will fail.
- Correct save: The model must be saved using a method/class that exposes the streaming endpoint, not just the standard predict endpoint (a quick check is sketched below).
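A quick way to confirm which case you are in is to load the registered model back as a pyfunc and try the streaming call. This is a minimal sketch: the model URI and payload are placeholders for your own agent, and predict_stream on loaded pyfunc models requires a reasonably recent MLflow version.

```python
import mlflow

# Placeholder URI: substitute your own registered model or run artifact
model_uri = "models:/my_agent/1"
loaded = mlflow.pyfunc.load_model(model_uri)

# Placeholder chat-style payload; adjust to your model's signature
payload = {"messages": [{"role": "user", "content": "Hello"}]}

# Standard prediction should work for any valid pyfunc model
print(loaded.predict(payload))

# Streaming only works if the underlying model implements predict_stream;
# otherwise this raises, which is the same condition that breaks the endpoint
try:
    for chunk in loaded.predict_stream(payload):
        print(chunk)
except Exception as exc:
    print(f"Streaming not supported by this model: {exc}")
```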
How to Resolve
1. Use a Supported ChatModel With Streaming
Ensure you are using an MLflow ChatModel implementation that supports streaming, e.g. OpenAI, Databricks MosaicML, or similar. When saving the model, use mlflow.langchain.save_model() (or the equivalent function for your model flavor), passing a class that implements the streaming method.
2. Implement Streaming in Your Model
- Your ChatModel class (or whichever class is wrapped for MLflow model serving) should implement a predict_stream method (see the sketch after this list).
- In LangChain and LangGraph settings, ensure the LLM object supports streaming (set streaming=True and use classes/interfaces that yield partial outputs).
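For a custom agent, the streaming contract looks roughly like the following. This is an illustrative sketch using mlflow.pyfunc.PythonModel (the same predict/predict_stream pairing applies to MLflow's ChatModel base class); the agent logic is a stand-in for your own code, and predict_stream support requires a recent MLflow version.

```python
import mlflow
from mlflow.pyfunc import PythonModel


class StreamingAgent(PythonModel):
    """Stand-in agent exposing both batch and streaming prediction."""

    def predict(self, context, model_input, params=None):
        # Non-streaming path: return the full response at once
        return {"output": "full response"}

    def predict_stream(self, context, model_input, params=None):
        # Streaming path: yield partial outputs as they become available
        for token in ["partial ", "response ", "chunks"]:
            yield {"delta": token}


with mlflow.start_run():
    mlflow.pyfunc.log_model(
        artifact_path="streaming_agent",
        python_model=StreamingAgent(),
    )
```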
3. Register and Deploy the Streaming Model
- Save the model using the appropriate MLflow saving function so that the streaming capability is retained.
- When registering/deploying, the model artifact must expose predict_stream.
4. Check Your Deployment Code
When deploying the agent, ensure your inference endpoint is properly configured to use the streaming schema per the latest MLflow documentation.
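On the client side, a streaming response is typically consumed as server-sent events. The snippet below is a rough sketch of such a request against a Databricks Model Serving endpoint; the workspace URL, endpoint name, token, and payload shape are placeholders that depend on how your agent was deployed, so confirm the exact request schema in the current Databricks documentation.

```python
import os

import requests

# Placeholders: substitute your workspace URL, endpoint name, and token
url = "https://<workspace-url>/serving-endpoints/<endpoint-name>/invocations"
headers = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

# Assumed chat-style payload; "stream": True requests a streamed response
body = {
    "messages": [{"role": "user", "content": "Hello"}],
    "stream": True,
}

with requests.post(url, headers=headers, json=body, stream=True) as resp:
    resp.raise_for_status()
    for line in resp.iter_lines():
        if line:
            # Streamed chunks usually arrive as "data: {...}" SSE lines
            print(line.decode("utf-8"))
```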
Example: MLflow Streaming ChatModel
import mlflow
from langchain.chat_models import ChatOpenAI  # newer LangChain: from langchain_openai import ChatOpenAI

# Set up the LLM with streaming enabled
llm = ChatOpenAI(temperature=0.1, streaming=True)

# Save the model with the MLflow langchain flavor
mlflow.langchain.save_model(
    lc_model=llm,
    path="llm_model_streaming",
)
- Ensure the chat model (ChatOpenAI, MosaicML, etc.) supports streaming out of the box and is saved with that capability.
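Once saved, the model can be loaded back as a pyfunc and exercised through the streaming interface. This follow-up sketch assumes the path from the example above and a recent MLflow version; the exact input format depends on the model's signature.

```python
import mlflow

loaded = mlflow.pyfunc.load_model("llm_model_streaming")

# Stream partial outputs as the underlying LLM produces them
for chunk in loaded.predict_stream("Tell me a short joke"):
    print(chunk)
```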
References to the Official Docs
See the official Databricks agent streaming guide and the MLflow ChatModel/streaming documentation: confirm the streaming interface is present and properly implemented when you save and subsequently deploy the model.
Key Steps to Fix
- Verify that your MLflow saving function (e.g. save_model()) saves a streaming-capable ChatModel.
- Re-register the model in MLflow after confirming that the underlying implementation is compatible with streaming (see the snippet below).
- Update deployment code or configs to use the streaming endpoint (predict_stream).
If the underlying LLM class or deployment does not support streaming, you must swap to a compatible class and redeploy.
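Re-registration can be as simple as registering the newly logged, streaming-capable artifact under your existing registered-model name. The run ID, artifact path, and model name below are placeholders.

```python
import mlflow

# Placeholders: substitute the run that logged your streaming-capable model
model_uri = "runs:/<run-id>/streaming_agent"

# Register a new version under the existing registered-model name
mlflow.register_model(model_uri=model_uri, name="my_agent")
```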
Table: Error Cause and Resolution
Cause |
Resolution |
Model lacks predict_stream method |
Save with streaming ChatModel |
Wrong MLflow save function or model class |
Use mlflow.langchain.save_model |
LLM streaming not enabled in config |
Set stream=True in LLM params |
Implement these corrections, re-save and deploy your MLflow model, and the streaming output should work for your agent in Databricks.