I have a Service Principal (for M2M auth) with read access to a Databricks Model Registry. I can successfully search the registry via the `WorkspaceClient` from the Databricks Python SDK and find the model that I want to load, but I cannot load the model for inference.
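For reference, this is roughly what the working part looks like (workspace host, credentials, and the exact listing method are placeholders/approximations of my real code):

```python
from databricks.sdk import WorkspaceClient

# OAuth M2M auth using the Service Principal's client ID and secret
w = WorkspaceClient(
    host="https://<my-workspace>.cloud.databricks.com",
    client_id="<service-principal-client-id>",
    client_secret="<service-principal-oauth-secret>",
)

# This part works: I can enumerate registered models and find the one I need
for m in w.model_registry.list_models():
    print(m.name)
```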
Loading for inference seems to require MLflow, but each time I try to use `mlflow.MlflowClient` to load a model, it errors out with `InvalidConfigurationError: You haven't configured the CLI yet!`.
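This is the kind of loading code that fails (model name and version are placeholders); the error is raised as soon as MLflow tries to talk to the Databricks backend:

```python
import mlflow
from mlflow import MlflowClient

# Point MLflow at the Databricks workspace tracking server / registry
mlflow.set_tracking_uri("databricks")
mlflow.set_registry_uri("databricks")

client = MlflowClient()

# Fails here (or on load_model below) with:
# InvalidConfigurationError: You haven't configured the CLI yet!
mv = client.get_model_version(name="my-model", version="1")
model = mlflow.pyfunc.load_model("models:/my-model/1")
```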
I need to be able to use M2M auth, and configuring the CLI is not possible in my workflow.
How can I use the M2M auth approach to load models with MLflow, or is there a different model-loading workflow that I should follow?