Data Engineering
Column-wise sum in a PySpark DataFrame

siddhu308
New Contributor II

I have a DataFrame with 18,000,000 rows and 1,322 columns containing '0' and '1' values.

I want to find how many '1's there are in every column.

Below is a sample of the dataset (column names):

se_00001 se_00007 se_00036 se_00100 se_0010p se_00250

2 REPLIES

siddhu308
New Contributor II

seag_00001 seag_00007 seag_00036 seag_00100 seag_0010p seag_00250
1          0          1          0          0          0

mathan_pillai
Valued Contributor

Hi Siddhu,

You can use:

    from pyspark.sql.functions import sum as spark_sum

    df.select(spark_sum("col1"), spark_sum("col2"), spark_sum("col3")).show()

where col1, col2, col3 are the names of the columns you want to sum. Importing the aggregate function as spark_sum avoids shadowing Python's built-in sum.

Please let us know if this answers your question.

Thanks
