Get the size of a column in bytes for a PySpark DataFrame

Anbazhagananbut
New Contributor II

Hello All,

I have a column in a DataFrame that is of struct type. I want to find the size of that column in bytes, because the load into Snowflake is failing.

I can see a size function available to get the length, but how do I calculate the size in bytes for a column in a PySpark DataFrame?

pyspark.sql.functions.size(col)

Collection function: returns the length of the array or map stored in the column.
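
For illustration, size() only counts elements of an array or map; it does not report bytes. A minimal sketch, assuming a toy DataFrame with an array column:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Toy DataFrame with an array column (hypothetical example data)
df = spark.createDataFrame([([1, 2, 3],)], ["values"])

# size() returns the element count (3 here), not the storage size in bytes
df.select(F.size("values").alias("num_elements")).show()
```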

Please help me with this case.

Thanks

1 REPLY

sean_owen
Honored Contributor II

There isn't a single size for a column: it takes a certain number of bytes in memory, but potentially a different amount when serialized to disk or stored in Parquet. You can estimate the in-memory size from the data type; an array of 100 bytes takes 100 bytes, a long takes 8 bytes, etc.
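
One way to approximate a per-row serialized size is to render the struct column as JSON and measure its byte length. This is only an estimate of a serialized footprint, not the in-memory or Parquet size. A minimal sketch, assuming a DataFrame df with a struct column named struct_col (both names are placeholders):

```python
from pyspark.sql import functions as F

# Approximate serialized size per row: serialize the struct to JSON and
# count its bytes with SQL's octet_length (an estimate only; actual
# Parquet and in-memory sizes will differ)
sized = df.withColumn(
    "approx_bytes", F.expr("octet_length(to_json(struct_col))")
)

# Aggregate to see the total and the largest per-row size for the column
sized.agg(
    F.sum("approx_bytes").alias("total_bytes"),
    F.max("approx_bytes").alias("max_row_bytes"),
).show()
```

The max per-row value is often the more useful number when a downstream system such as Snowflake rejects rows that exceed a column size limit.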
