Hello,
I'm trying to create a custom serving endpoint. I use the artifacts argument while logging the run/model to save .jar files, and these files are loaded when .predict is called.
A Java runtime (8 or higher) is required to run the jar file, and I'm not sure how to create a serving endpoint that has a Java runtime available.
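For context, the model is logged roughly like this (the wrapper class is shown below; the .jar path, artifact path, and pip requirements are placeholders, the only thing the wrapper relies on is the 'model_jar' artifact key):

import mlflow

with mlflow.start_run():
    mlflow.pyfunc.log_model(
        artifact_path="model",
        python_model=ModelWrapper_custom_DataRobot_To_Linesense(model=None),
        # .jar exported from DataRobot Scoring Code; the path is a placeholder
        artifacts={"model_jar": "/dbfs/tmp/scoring_code_model.jar"},
        # Python-level dependencies only; this does not install a Java runtime
        pip_requirements=["datarobot-predict", "pandas"],
    )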
import mlflow
import pandas as pd
# Assuming ScoringCodeModel comes from DataRobot's datarobot-predict package
from datarobot_predict.scoring_code import ScoringCodeModel

# Model wrapper class
# This model returns the probabilities of all classes, the prediction, and the prediction probability
class ModelWrapper_custom_DataRobot_To_Linesense(mlflow.pyfunc.PythonModel):
    # Initialize model in the constructor
    def __init__(self, model):
        self.model = model

    # Prediction function
    def predict(self, context, model_input):
        # Load the DataRobot Scoring Code model from the logged .jar artifact (needs a Java runtime)
        model = ScoringCodeModel(context.artifacts['model_jar'])
        model_output = model.predict(model_input)
        df_temp = pd.DataFrame()
        # Extract the probability value of the selected prediction
        df_temp['prediction_probability'] = model_output.max(axis=1)
        # Find the column with the maximum probability for each row
        df_temp['prediction'] = model_output.idxmax(axis=1)
        # Remove NaN rows; required to avoid errors when extracting the label below
        df_temp = df_temp.dropna()
        # Extract the class label (middle token) from the 'prediction' column name
        df_temp['prediction'] = df_temp['prediction'].apply(lambda x: x.split('_')[1])
        # Return the results as JSON records
        return df_temp.to_json(orient='records')
This is a simplified version of the model wrapper. When the serving endpoint is deployed, it cannot run inference because the Java runtime is missing.
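For reference, this is roughly how the logged model is exercised (the run ID and input columns are placeholders); the .predict call is where the missing Java runtime shows up once the model is served:

import mlflow
import pandas as pd

# Placeholder run ID and input features
loaded = mlflow.pyfunc.load_model("runs:/<run_id>/model")
sample = pd.DataFrame({"feature_1": [0.1], "feature_2": [1.2]})
# This call instantiates ScoringCodeModel from the .jar, which requires Java 8+
print(loaded.predict(sample))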