Hi everyone,
I am using Databricks and MLflow to create a model and then register it behind a serving endpoint. Sometimes the model takes more than 2 minutes to evaluate, and after 2 minutes it gives a timeout error:
Timed out while evaluating the model. Verify that the model evaluates within the timeout.
These are the three environment variables I have identified that I think would help solve the issue:
MLFLOW_SCORING_SERVER_REQUEST_TIMEOUT
MLFLOW_REQUIREMENTS_INFERENCE_TIMEOUT
MLFLOW_HTTP_REQUEST_TIMEOUT
But I can't figure out where to set these in the code, and the MLflow documentation doesn't seem to help.
I have tried adding the parameters to the conda env, but that doesn't help.
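For context, here is a minimal sketch of what I also tried: setting the variables with os.environ in the notebook before logging the model. The value 600 is just a placeholder I picked; I have no confirmation that the serving endpoint actually picks these up.

```python
import os

# Attempted workaround (sketch): set the timeout environment variables
# in the notebook process before logging/evaluating the model.
# 600 seconds is an arbitrary placeholder value, not a documented default.
os.environ["MLFLOW_SCORING_SERVER_REQUEST_TIMEOUT"] = "600"
os.environ["MLFLOW_REQUIREMENTS_INFERENCE_TIMEOUT"] = "600"
os.environ["MLFLOW_HTTP_REQUEST_TIMEOUT"] = "600"

# Verify the variables are visible in this process.
for name in (
    "MLFLOW_SCORING_SERVER_REQUEST_TIMEOUT",
    "MLFLOW_REQUIREMENTS_INFERENCE_TIMEOUT",
    "MLFLOW_HTTP_REQUEST_TIMEOUT",
):
    print(name, "=", os.environ[name])
```

This makes the variables visible to the notebook process, but I suspect the serving container runs in a separate environment, so is there a way to pass them through the endpoint configuration instead?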