Get size of a column in bytes for a PySpark DataFrame
04-16-2020 11:49 AM
Hello All,
I have a column in a DataFrame which is of struct type, and I want to find the size of that column in bytes. The load into Snowflake is failing, so I need to know how large the column actually is.

I could only find functions that return the length rather than the size in bytes. For example, pyspark.sql.functions.size(col) is a collection function that returns the number of elements in the array or map stored in the column.

How can I calculate the size in bytes of a column in a PySpark DataFrame? Please help me with this case.

Thanks
04-17-2020 02:16 PM
There isn't a single size for a column: it occupies some number of bytes in memory, but potentially a different amount when serialized to disk or stored in Parquet. You can estimate the in-memory size from its data type; for example, an array holding 100 single-byte elements takes about 100 bytes, a long takes 8 bytes, and so on.
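If the practical need is a per-row byte estimate for the struct column (for example, to spot rows that exceed a loader's per-value size limit), one rough approach is to serialize the struct to JSON and measure the resulting string length. A minimal sketch, not an official API; the sample schema and the column name my_struct_col below are placeholders:

```python
from pyspark.sql import Row, SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Small sample DataFrame with a struct column (placeholder schema and data).
df = spark.createDataFrame(
    [
        (1, Row(name="alpha", tags=["a", "b"])),
        (2, Row(name="beta", tags=[])),
    ],
    "id INT, my_struct_col STRUCT<name: STRING, tags: ARRAY<STRING>>",
)

# Rough per-row size estimate: serialize the struct to JSON and measure its length.
# length() counts characters; for multi-byte text, the SQL expression
# octet_length(to_json(my_struct_col)) gives the exact byte count instead.
sized = df.withColumn(
    "struct_json_len",
    F.length(F.to_json(F.col("my_struct_col"))),
)

# Largest and total estimates across the DataFrame, e.g. to find offending rows.
sized.agg(
    F.max("struct_json_len").alias("max_len"),
    F.sum("struct_json_len").alias("total_len"),
).show()
```

Note that this only measures the JSON serialization of the value, which is usually close to what a text-based loader sees; it says nothing about the in-memory or Parquet footprint discussed above.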