<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: LLM with the largest context window in Generative AI</title>
    <link>https://community.databricks.com/t5/generative-ai/llm-with-the-largest-context-window/m-p/118255#M868</link>
    <description>&lt;P&gt;Hey royinblr11,&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Where did this question come from, and when was it published? You are correct that the latest DBRX model has a 32k-token context window, larger than MPT-30B's 8k-token context window. We last published this figure in March 2024, so if the question predates that, it may be out of date.&lt;/P&gt;
&lt;P&gt;&lt;A href="https://www.databricks.com/blog/introducing-dbrx-new-state-art-open-llm" target="_blank"&gt;https://www.databricks.com/blog/introducing-dbrx-new-state-art-open-llm&lt;/A&gt;&lt;/P&gt;</description>
    <pubDate>Wed, 07 May 2025 15:44:32 GMT</pubDate>
    <dc:creator>sarahbhord</dc:creator>
    <dc:date>2025-05-07T15:44:32Z</dc:date>
    <item>
      <title>LLM with the largest context window</title>
      <link>https://community.databricks.com/t5/generative-ai/llm-with-the-largest-context-window/m-p/117860#M867</link>
      <description>&lt;P&gt;&lt;SPAN&gt;A Generative AI Engineer is tasked with developing an application based on an open-source large language model (LLM). They need a foundation LLM with a large context window. Which model fits this need?&lt;/SPAN&gt;&lt;/P&gt;&lt;OL&gt;&lt;LI&gt;DBRX&lt;/LI&gt;&lt;LI&gt;Llama2-70B&lt;/LI&gt;&lt;LI&gt;DistilBert&lt;/LI&gt;&lt;LI&gt;MPT-30B&lt;/LI&gt;&lt;/OL&gt;&lt;P&gt;&lt;SPAN&gt;DBRX has a larger context window than MPT-30B:&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN&gt;DBRX supports 32k tokens, while MPT-30B supports 8k tokens. However, the given answer is MPT-30B. Can anyone please help clarify? Thanks in advance.&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 06 May 2025 09:42:18 GMT</pubDate>
      <guid>https://community.databricks.com/t5/generative-ai/llm-with-the-largest-context-window/m-p/117860#M867</guid>
      <dc:creator>royinblr11</dc:creator>
      <dc:date>2025-05-06T09:42:18Z</dc:date>
    </item>
    <item>
      <title>Re: LLM with the largest context window</title>
      <link>https://community.databricks.com/t5/generative-ai/llm-with-the-largest-context-window/m-p/118255#M868</link>
      <description>&lt;P&gt;Hey royinblr11,&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Where did this question come from, and when was it published? You are correct that the latest DBRX model has a 32k-token context window, larger than MPT-30B's 8k-token context window. We last published this figure in March 2024, so if the question predates that, it may be out of date.&lt;/P&gt;
&lt;P&gt;&lt;A href="https://www.databricks.com/blog/introducing-dbrx-new-state-art-open-llm" target="_blank"&gt;https://www.databricks.com/blog/introducing-dbrx-new-state-art-open-llm&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 07 May 2025 15:44:32 GMT</pubDate>
      <guid>https://community.databricks.com/t5/generative-ai/llm-with-the-largest-context-window/m-p/118255#M868</guid>
      <dc:creator>sarahbhord</dc:creator>
      <dc:date>2025-05-07T15:44:32Z</dc:date>
    </item>
    <item>
      <title>Re: LLM with the largest context window</title>
      <link>https://community.databricks.com/t5/generative-ai/llm-with-the-largest-context-window/m-p/118346#M869</link>
      <description>&lt;P&gt;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/163213"&gt;@royinblr11&lt;/a&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;You're right to question the answer: the &lt;STRONG&gt;correct model for an application needing a foundation LLM with a large context window&lt;/STRONG&gt; is &lt;STRONG&gt;DBRX&lt;/STRONG&gt;.&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;Why DBRX is the best fit:&lt;/STRONG&gt;&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;It is a &lt;STRONG&gt;foundation model&lt;/STRONG&gt; designed for &lt;STRONG&gt;generation tasks&lt;/STRONG&gt;.&lt;/LI&gt;&lt;LI&gt;It supports a &lt;STRONG&gt;32k-token context window&lt;/STRONG&gt; out of the box, which is ideal for handling long documents, chats, or code.&lt;/LI&gt;&lt;LI&gt;It is open source and production-ready for &lt;STRONG&gt;enterprise-level generative AI tasks&lt;/STRONG&gt;.&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;So, if the goal is to build a &lt;STRONG&gt;generative application&lt;/STRONG&gt; using an &lt;STRONG&gt;open-source&lt;/STRONG&gt; model with a &lt;STRONG&gt;large context window&lt;/STRONG&gt;, then the &lt;STRONG&gt;best answer is DBRX&lt;/STRONG&gt;, not MPT-30B.&lt;/P&gt;</description>
      <pubDate>Wed, 07 May 2025 20:38:17 GMT</pubDate>
      <guid>https://community.databricks.com/t5/generative-ai/llm-with-the-largest-context-window/m-p/118346#M869</guid>
      <dc:creator>lingareddy_Alva</dc:creator>
      <dc:date>2025-05-07T20:38:17Z</dc:date>
    </item>
  </channel>
</rss>