We are trying to use MLflow Model Serving, which enables real-time model serving behind a REST API; it launches a single-node cluster that hosts our model.
The issue occurs when the single-node cluster tries to prepare the environment based on the conda.yaml file that was created when the model was logged with MLflow. It looks like I can only specify pip dependencies there, not Maven packages. This is how we build the environment:
import cloudpickle
import sklearn
from mlflow.utils.environment import _mlflow_conda_env

conda_env = _mlflow_conda_env(
    additional_conda_deps=None,
    additional_pip_deps=[
        "cloudpickle=={}".format(cloudpickle.__version__),
        "scikit-learn=={}".format(sklearn.__version__),
        "pyspark==3.0.0",
    ],
    additional_conda_channels=None,
)
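For reference, a conda.yaml generated from an environment like the one above typically looks something like the following (the Python version and environment name are illustrative, not taken from our actual output). Note that the format only has slots for conda packages, conda channels, and a nested pip list; there is no field for Maven coordinates or JAR files.

```yaml
# Illustrative conda.yaml as logged by MLflow alongside the model
name: mlflow-env
channels:
  - conda-forge
dependencies:
  - python=3.8.10
  - pip
  - pip:
      - cloudpickle==1.6.0
      - scikit-learn==0.24.1
      - pyspark==3.0.0
```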
How can I tell the cluster to install a Maven JAR dependency as well?