Generative AI
Explore discussions on generative artificial intelligence techniques and applications within the Databricks Community. Share ideas, challenges, and breakthroughs in this cutting-edge field.

How can we choose the right LLM, also considering cost?

Amit_Dass_Chmp
New Contributor III

Hi All, 

I understand we are on an evaluation journey with 100+ LLMs available (open and proprietary models). Please guide us on how to choose the right LLM, also considering cost.

Is there any benchmark we can follow for each use-case category?

Best Regards

Amit Dass

 

1 REPLY

mhiltner
Databricks Employee

Hey! 

It depends a lot on your application. Different LLMs excel at different tasks. Proprietary models will be more expensive than open-source ones, but they will likely get your applications to production faster.
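To make the cost side concrete, here is a minimal sketch for comparing rough monthly spend across candidate models. The model names and per-token prices are placeholders, not real published pricing; plug in your provider's current rates and your own traffic profile.

```python
# Hypothetical cost comparison. All prices are PLACEHOLDERS
# (USD per 1M tokens) -- check your provider's actual pricing.
PRICES = {
    "proprietary-large": {"input": 3.00, "output": 15.00},
    "open-weights-small": {"input": 0.20, "output": 0.60},
}

def monthly_cost(model, requests_per_month, in_tokens, out_tokens):
    """Estimate monthly spend for one model given average tokens per request."""
    p = PRICES[model]
    per_request = (in_tokens * p["input"] + out_tokens * p["output"]) / 1_000_000
    return per_request * requests_per_month

# Example profile: 100k requests/month, ~1,000 input and ~300 output tokens each.
for model in PRICES:
    print(f"{model}: ${monthly_cost(model, 100_000, 1_000, 300):,.2f}/month")
```

Even a back-of-the-envelope table like this often rules out whole classes of models before you spend time on quality benchmarks.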

Also, if you are thinking about RAG or fine-tuning, that influences the model you choose. There are lighter models that take less time to fine-tune, and models that will better meet your latency requirements in RAG pipelines.

I recommend checking some benchmarks - you'll find ones for math, language understanding, etc. I also recommend Hugging Face's model hub, where you can pick models based on their task and popularity.

Finally, you can also use Databricks Playground to compare how multiple models respond to the same prompts.
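If you'd rather script that comparison than click through the Playground, here is a sketch that sends the same prompt to several model serving endpoints via the REST invocations API. The host, token, and endpoint names are placeholders for your own workspace values.

```python
# Hypothetical sketch: query several Databricks model serving endpoints
# with the same prompt. HOST, TOKEN, and endpoint names are placeholders.
import json
import urllib.request

HOST = "https://<your-workspace>.cloud.databricks.com"  # placeholder
TOKEN = "<your-personal-access-token>"                  # placeholder

def build_request(endpoint, prompt, max_tokens=256):
    """Build a chat-style invocation request for one serving endpoint."""
    url = f"{HOST}/serving-endpoints/{endpoint}/invocations"
    body = json.dumps({
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }).encode()
    return urllib.request.Request(
        url,
        data=body,
        headers={
            "Authorization": f"Bearer {TOKEN}",
            "Content-Type": "application/json",
        },
    )

def compare(endpoints, prompt):
    """Print each model's answer to the same prompt, side by side."""
    for name in endpoints:
        with urllib.request.urlopen(build_request(name, prompt)) as resp:
            answer = json.load(resp)["choices"][0]["message"]["content"]
        print(f"--- {name} ---\n{answer}\n")
```

Usage would look like `compare(["<endpoint-a>", "<endpoint-b>"], "Summarize the tradeoffs between RAG and fine-tuning.")` once real endpoint names and credentials are filled in. Pair the answers with the cost estimates above and you have a simple quality-per-dollar comparison.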
