I'm a software engineer and a bit new to Databricks. My goal is to create a model serving endpoint that interfaces with several ML models. Traditionally this would look like:
API --> Service --> Data
Now, using Databricks, my understanding is that it will look like:
Model Serving Endpoint --> Service Model --> ML Model
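To make the "Service Model" box concrete, here's a rough sketch of what I'm picturing: a custom MLflow pyfunc that loads a couple of registered models and routes each request to one of them. The model names, versions, and the `model_name` routing column are placeholders, and loading straight from the registry in `load_context` is just for illustration:

```python
import mlflow
import pandas as pd


class ServiceModel(mlflow.pyfunc.PythonModel):
    """A 'service model' that fronts several underlying ML models."""

    def load_context(self, context):
        # Load the underlying models once when the serving container starts.
        # Registry names/versions here are placeholders.
        self.models = {
            "churn": mlflow.pyfunc.load_model("models:/churn_model/1"),
            "upsell": mlflow.pyfunc.load_model("models:/upsell_model/1"),
        }

    def predict(self, context, model_input: pd.DataFrame):
        # Route the request based on a hypothetical "model_name" column
        # in the payload, then drop it before scoring.
        target = model_input["model_name"].iloc[0]
        features = model_input.drop(columns=["model_name"])
        return self.models[target].predict(features)


# Log and register the wrapper so it can be served like any other model.
with mlflow.start_run():
    mlflow.pyfunc.log_model(
        artifact_path="service_model",
        python_model=ServiceModel(),
        registered_model_name="service_model",
    )
```

My thinking is that this keeps everything behind a single endpoint with one place for routing and input validation, which is roughly what the "Service" layer did in the traditional picture.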
From a best-practices perspective, what is the best way to deploy? A single DAB (Databricks Asset Bundle) that bundles all of the resources onto a single cluster? Or multiple deployed models/clusters in more of a microservice fashion?
Also, is the service model even necessary?
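For context on what I mean by deploying: my rough understanding is that, with or without the wrapper, the endpoint itself gets created something like the sketch below (or declared as a resource in the DAB and deployed with `databricks bundle deploy`). This is just a sketch using the Databricks Python SDK; the endpoint name, model name, version, and sizing are placeholders and I may have some of the options wrong:

```python
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.serving import EndpointCoreConfigInput, ServedEntityInput

w = WorkspaceClient()

# Stand up a serving endpoint backed by the registered "service_model" wrapper.
w.serving_endpoints.create(
    name="ml-service",
    config=EndpointCoreConfigInput(
        served_entities=[
            ServedEntityInput(
                entity_name="service_model",  # registered model name (placeholder)
                entity_version="1",
                workload_size="Small",
                scale_to_zero_enabled=True,
            )
        ]
    ),
)
```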
I can see benefits to each approach, but I'm certain there are aspects I'm overlooking. I'd love to hear how others are deploying.