<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Databricks External table row maximum size in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/databricks-external-table-row-maximum-size/m-p/116160#M45254</link>
    <description>&lt;P&gt;Hi Dennis, just to confirm once more: the 2-2.5 GB limit is per row, right?&lt;/P&gt;</description>
    <pubDate>Tue, 22 Apr 2025 08:42:40 GMT</pubDate>
    <dc:creator>Mano99</dc:creator>
    <dc:date>2025-04-22T08:42:40Z</dc:date>
    <item>
      <title>Databricks External table row maximum size</title>
      <link>https://community.databricks.com/t5/data-engineering/databricks-external-table-row-maximum-size/m-p/116121#M45245</link>
      <description>&lt;P&gt;Hi Databricks Team / Community,&lt;BR /&gt;&lt;BR /&gt;We have created Databricks external tables on top of ADLS Gen2, both Parquet and Delta. We are loading nested JSON structures into a table, and a few columns hold very large nested JSON data. I am getting a "results too large" error. Transformations and other operations work fine; only displaying the data fails.&lt;BR /&gt;&lt;BR /&gt;What I want to know is: what is the maximum size per row (in MB or GB) that Databricks can accept or store?&lt;BR /&gt;Some references from Google and AI tools say up to 2.5 GB. Is that true? If anyone knows the exact number, please share it here, and leave a comment on the issue above so we can understand it better.&lt;BR /&gt;&lt;BR /&gt;Thanks &amp;amp; Regards,&lt;BR /&gt;Manohar G&lt;/P&gt;</description>
      <pubDate>Tue, 22 Apr 2025 03:42:19 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/databricks-external-table-row-maximum-size/m-p/116121#M45245</guid>
      <dc:creator>Mano99</dc:creator>
      <dc:date>2025-04-22T03:42:19Z</dc:date>
    </item>
    <item>
      <title>Re: Databricks External table row maximum size</title>
      <link>https://community.databricks.com/t5/data-engineering/databricks-external-table-row-maximum-size/m-p/116136#M45250</link>
      <description>&lt;BLOCKQUOTE&gt;&lt;HR /&gt;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/160177"&gt;@Mano99&lt;/a&gt;&amp;nbsp;wrote:&lt;BR /&gt;&lt;P&gt;Hi Databricks Team / Community,&lt;BR /&gt;&lt;BR /&gt;We have created Databricks external tables on top of ADLS Gen2, both Parquet and Delta. We are loading nested JSON structures into a table, and a few columns hold very large nested JSON data. I am getting a "results too large" error. Transformations and other operations work fine; only displaying the data fails.&lt;BR /&gt;&lt;BR /&gt;What I want to know is: what is the maximum size per row (in MB or GB) that Databricks can accept or store?&lt;BR /&gt;Some references from Google and AI tools say up to 2.5 GB. Is that true? If anyone knows the exact number, please share it here, and leave a comment on the issue above so we can understand it better.&lt;BR /&gt;&lt;BR /&gt;Thanks &amp;amp; Regards,&lt;BR /&gt;Manohar G&lt;/P&gt;&lt;HR /&gt;&lt;/BLOCKQUOTE&gt;&lt;P&gt;Databricks/Spark can generally store rows of up to around 2-2.5 GB; this is a practical limit imposed by the underlying data structures. However, the "results too large" error you are seeing is a limit on the driver node's ability to *display* large result sets, especially with huge nested JSON columns, not a limit on what can be stored. To work around it, avoid displaying the entire table directly: use `.limit()`, filter for specific rows, project only the columns you need, sample the data, or write the data to a file for external analysis. The storage limit is separate from the display limitation.&lt;/P&gt;</description>
      <pubDate>Tue, 22 Apr 2025 06:28:50 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/databricks-external-table-row-maximum-size/m-p/116136#M45250</guid>
      <dc:creator>dennis65</dc:creator>
      <dc:date>2025-04-22T06:28:50Z</dc:date>
    </item>
    <item>
      <title>Re: Databricks External table row maximum size</title>
      <link>https://community.databricks.com/t5/data-engineering/databricks-external-table-row-maximum-size/m-p/116160#M45254</link>
      <description>&lt;P&gt;Hi Dennis, just to confirm once more: the 2-2.5 GB limit is per row, right?&lt;/P&gt;</description>
      <pubDate>Tue, 22 Apr 2025 08:42:40 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/databricks-external-table-row-maximum-size/m-p/116160#M45254</guid>
      <dc:creator>Mano99</dc:creator>
      <dc:date>2025-04-22T08:42:40Z</dc:date>
    </item>
  </channel>
</rss>

