Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Identifying the right tools for the job

UM
New Contributor II

Hi all, thank you for taking the time to read my post. Some background to preface: my team and I have been prototyping an ML model that we would like to move into production and deployment. We have been prototyping in Jupyter notebooks but are trying to figure out which tools and platforms we may require.

I'm by no means an expert in data architecture and was hoping someone could shed some light. The attached diagram shows my understanding so far (it assumes that something like Databricks or SageMaker is an all-in-one platform). However, I'm not sure whether these platforms have functionality like:

  • version control
  • parameterization of the notebooks
  • templatizing notebooks

Hope to hear back and to take part in an active discussion! Thank you so much!

1 ACCEPTED SOLUTION

Dan_Z
Databricks Employee

For production model serving, why not just use MLflow Model Serving? You code up or import the model in your notebooks, log it with MLflow, register it in the MLflow Model Registry, and then there is a nice UI for serving it with Model Serving. It exposes a REST endpoint for your model that any application can hit.
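A rough sketch of that flow, assuming a scikit-learn model as a stand-in for your prototype and an MLflow tracking/registry setup that is already available; the registered model name is just a placeholder:

```python
import mlflow
import mlflow.sklearn
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

# Stand-in for the model you prototyped in your notebooks.
X, y = load_iris(return_X_y=True)
model = RandomForestClassifier(n_estimators=100).fit(X, y)

with mlflow.start_run():
    # Log the fitted model as a run artifact and register it in the
    # MLflow Model Registry in one step.
    mlflow.sklearn.log_model(
        model,
        artifact_path="model",
        registered_model_name="iris-classifier",  # placeholder name
    )
```

Once a registered version is served with Model Serving, any application can hit the REST endpoint with a plain HTTP POST; a minimal client sketch (the workspace URL, token, and exact JSON payload format depend on your workspace and MLflow version):

```python
import requests

# Placeholder endpoint URL and access token for a Databricks workspace.
response = requests.post(
    "https://<workspace-url>/model/iris-classifier/1/invocations",
    headers={"Authorization": "Bearer <personal-access-token>"},
    json={"inputs": [[5.1, 3.5, 1.4, 0.2]]},  # one row of features
)
print(response.json())
```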

2 REPLIES

-werners-
Esteemed Contributor III

It is hard to tell what you need. Are you going to use Spark?
