Generative AI
Explore discussions on generative artificial intelligence techniques and applications within the Databricks Community. Share ideas, challenges, and breakthroughs in this cutting-edge field.

LLM with the largest context window

royinblr11
New Contributor II

A Generative AI Engineer is tasked with developing an application that is based on an open-source large language model (LLM). They need a foundation LLM with a large context window. Which model fits this need?

  1. DBRX
  2. Llama2-70B
  3. DistilBERT
  4. MPT-30B

DBRX has a larger context window than MPT-30B: DBRX has a 32K token context window, while MPT-30B has an 8K token context window. However, the given answer is MPT-30B. Can anyone please help here? Thanks in advance.

1 ACCEPTED SOLUTION


sarahbhord
Databricks Employee

Hey royinblr11, 

Where did this question come from, and when was it published? You are correct that the latest DBRX model has a 32K token context window, larger than MPT-30B's 8K context window. Our latest publication on this stat was March 2024; if the question was published before then, it may be out of date.

https://www.databricks.com/blog/introducing-dbrx-new-state-art-open-llm


2 REPLIES


lingareddy_Alva
Honored Contributor III

@royinblr11 

You're absolutely right to question the answer: the correct model for an application that needs an open-source foundation LLM with a large context window is DBRX.

Why DBRX is the Best Fit:

  • It is a foundation model, designed for generation tasks.
  • It supports a 32K token context window out of the box, ideal for handling long documents, chats, or code.
  • It is open-source and production-ready for enterprise-level generative AI tasks.
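To make the context-window difference concrete, here is a minimal sketch of checking whether a prompt fits a given model's window. The window sizes (32K for DBRX, 8K for MPT-30B) come from this thread; the ~4 characters per token estimate and the `max_new_tokens` budget are illustrative assumptions only, and a real application should count tokens with the model's own tokenizer.

```python
# Rough check of whether a prompt fits a model's context window.
# ASSUMPTION: ~4 characters per token (a common English-text heuristic);
# use the model's actual tokenizer in production.

CONTEXT_WINDOWS = {
    "dbrx": 32_768,   # 32K token context window
    "mpt-30b": 8_192, # 8K token context window
}

def estimate_tokens(text: str) -> int:
    """Very rough token estimate: about 4 characters per token."""
    return max(1, len(text) // 4)

def fits_context(model: str, prompt: str, max_new_tokens: int = 512) -> bool:
    """True if the prompt plus a generation budget fits the model's window."""
    window = CONTEXT_WINDOWS[model]
    return estimate_tokens(prompt) + max_new_tokens <= window

long_doc = "word " * 40_000  # ~200K characters, ~50K estimated tokens
print(fits_context("dbrx", long_doc))        # False: exceeds even 32K
print(fits_context("dbrx", "short prompt"))  # True
```

A document of roughly 10K estimated tokens would fit DBRX's 32K window but overflow MPT-30B's 8K window, which is exactly why the larger window matters for long-document use cases.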

So, if the goal is:

Build a generative application using an open-source model with a large context window,

Then the best answer is DBRX, not MPT-30B.

 

LR
