<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic [VARIANT_SIZE_LIMIT] Cannot build variant bigger than 16.0 MiB in parse_json in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/variant-size-limit-cannot-build-variant-bigger-than-16-0-mib-in/m-p/133563#M49880</link>
    <description>&lt;P&gt;I have a table coming from PostgreSQL, with one column containing JSON data in string format. We have been using parse_json to convert that to a variant column, but lately it has been failing with the VARIANT_SIZE_LIMIT error.&lt;/P&gt;&lt;P&gt;When I isolated the row that gave the error, its JSON string contains an array of 75 items (with four fields each). When I saved that column as a text/JSON file, it took only 30 KB of disk space. &lt;SPAN&gt;When I tried the from_json function with a predefined schema, I was able to convert it to a StructType column successfully.&lt;BR /&gt;&lt;BR /&gt;We would like to keep the column as Variant, as we don't want it tied to a specific schema. Any suggestions?&lt;/SPAN&gt;&lt;/P&gt;</description>
    <pubDate>Thu, 02 Oct 2025 18:26:00 GMT</pubDate>
    <dc:creator>bbastian</dc:creator>
    <dc:date>2025-10-02T18:26:00Z</dc:date>
    <item>
      <title>[VARIANT_SIZE_LIMIT] Cannot build variant bigger than 16.0 MiB in parse_json</title>
      <link>https://community.databricks.com/t5/data-engineering/variant-size-limit-cannot-build-variant-bigger-than-16-0-mib-in/m-p/133563#M49880</link>
      <description>&lt;P&gt;I have a table coming from PostgreSQL, with one column containing JSON data in string format. We have been using parse_json to convert that to a variant column, but lately it has been failing with the VARIANT_SIZE_LIMIT error.&lt;/P&gt;&lt;P&gt;When I isolated the row that gave the error, its JSON string contains an array of 75 items (with four fields each). When I saved that column as a text/JSON file, it took only 30 KB of disk space. &lt;SPAN&gt;When I tried the from_json function with a predefined schema, I was able to convert it to a StructType column successfully.&lt;BR /&gt;&lt;BR /&gt;We would like to keep the column as Variant, as we don't want it tied to a specific schema. Any suggestions?&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Thu, 02 Oct 2025 18:26:00 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/variant-size-limit-cannot-build-variant-bigger-than-16-0-mib-in/m-p/133563#M49880</guid>
      <dc:creator>bbastian</dc:creator>
      <dc:date>2025-10-02T18:26:00Z</dc:date>
    </item>
    <item>
      <title>Re: [VARIANT_SIZE_LIMIT] Cannot build variant bigger than 16.0 MiB in parse_json</title>
      <link>https://community.databricks.com/t5/data-engineering/variant-size-limit-cannot-build-variant-bigger-than-16-0-mib-in/m-p/133569#M49883</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/188479"&gt;@bbastian&lt;/a&gt;,&lt;/P&gt;&lt;P&gt;Unfortunately, as of now there is a strict size limitation: a&lt;SPAN&gt;&amp;nbsp;variant column cannot contain a value larger than 16 MiB.&lt;BR /&gt;&lt;BR /&gt;&lt;A href="https://docs.databricks.com/aws/en/delta/variant#limitations" target="_blank" rel="noopener"&gt;Variant support in Delta Lake | Databricks on AWS&lt;/A&gt;&lt;BR /&gt;&lt;BR /&gt;Also, you cannot compare the size of the JSON string on disk with its internal in-memory representation, because a lot happens during parsing (i.e., the JSON is tokenized into objects, arrays, keys, and values).&lt;BR /&gt;&lt;BR /&gt;In your case you have 3 options:&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;1. Store the column as a JSON string.&amp;nbsp;&lt;SPAN&gt;You can keep the data in a single string column using standard JSON formatting and then query fields in the JSON using&amp;nbsp;&lt;STRONG&gt;:&lt;/STRONG&gt;&lt;/SPAN&gt;&lt;SPAN&gt;&amp;nbsp;notation. A JSON string column supports strings of arbitrary length.&lt;BR /&gt;&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;&lt;SPAN&gt;2. Use structs - they provide great read performance, but you lose flexibility (you need to define the schema upfront).&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;&lt;SPAN&gt;3. Preprocess your data - depending on its shape, you can try splitting the array of items into 2 or 3 smaller arrays and storing them in separate variant columns. This way you can work around the 16 MiB size limit.&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Thu, 02 Oct 2025 19:56:52 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/variant-size-limit-cannot-build-variant-bigger-than-16-0-mib-in/m-p/133569#M49883</guid>
      <dc:creator>szymon_dybczak</dc:creator>
      <dc:date>2025-10-02T19:56:52Z</dc:date>
    </item>
  </channel>
</rss>

