Machine Learning
Dive into the world of machine learning on the Databricks platform. Explore discussions on algorithms, model training, deployment, and more. Connect with ML enthusiasts and experts.
Possible to use `params` argument of `mlflow.pyfunc.PythonModel` deployed to Databricks endpoint?

dcunningham1
New Contributor

I'm deploying a custom model using the `mlflow.pyfunc.PythonModel` class as described here. My model uses the `params` argument in the `predict` method to allow the user to choose some aspects of the model at inference time. For example:

import mlflow


class CustomModelWrapper(mlflow.pyfunc.PythonModel):
    def __init__(self, model):
        self.model = model

    def predict(self, context, model_input, params=None):
        # `params` may be None when the caller passes no parameters.
        params = params or {}
        confidence_level = params.get("confidence_level", 0.9)
        return self.model.predict(model_input, confidence_level=confidence_level)

 

The model works fine when I load it from the MLflow model registry. Passing in params works as expected.

But it does not work when I deploy the model to a serving endpoint on Databricks: the endpoint appears to ignore the `params` argument. That's unfortunate, since inference-time parameters seem useful for many use cases.

Does Databricks have any plans to support the `params` argument in model serving endpoints in the future? In the meantime, is there any reasonable workaround to allow the user to specify parameters at inference time?
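One workaround I've considered is folding the parameters into the input payload itself as an extra reserved column and stripping it out inside `predict`. A minimal sketch of the idea (the class name and `_confidence_level` column are hypothetical; in practice the wrapper would subclass `mlflow.pyfunc.PythonModel` and `model_input` would be a pandas DataFrame rather than the plain dict used here):

```python
class ParamInInputWrapper:  # in practice: an mlflow.pyfunc.PythonModel subclass
    # Reserved column name used to carry the parameter in the request payload.
    PARAM_COL = "_confidence_level"

    def __init__(self, model):
        self.model = model

    def predict(self, context, model_input, params=None):
        # `model_input` here is a dict of column -> list of values, standing in
        # for the DataFrame a serving endpoint would pass.
        model_input = dict(model_input)
        # Pop the reserved parameter column; fall back to a default if absent.
        levels = model_input.pop(self.PARAM_COL, None)
        confidence_level = levels[0] if levels else 0.9
        return self.model.predict(model_input, confidence_level=confidence_level)
```

The client would then send `_confidence_level` alongside the real feature columns in the request body. The obvious downsides are that the parameter gets duplicated across every input row and that it pollutes the model's input schema, so I'd still prefer first-class `params` support.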

0 REPLIES
