- 447 Views
- 0 replies
- 0 kudos
How to convert the below query to Spark SQL, especially the ISNULL replacement?

SELECT ID, ISNULL(NAME, 'N/A') AS NAME, COMPANY FROM TEST
- 11971 Views
- 3 replies
- 0 kudos
In some SQL implementations, a SELECT can use -col_A to select all columns except col_A.
I tried it in Spark 1.6.0 as follows:
For a DataFrame df with three columns col_A, col_B, col_C:
df.select('col_B', 'col_C') # it works
df....
Latest Reply
@sk777, @zjffdu, @Lejla Metohajrova
If your columns are time-series ordered, or you want to maintain their original order, use:
cols = [c for c in df.columns if c != 'col_A']
df[cols]