In some SQL implementations, you can write `select -col_A` to select all columns except `col_A`.
I tried the same in Spark 1.6.0:
For a DataFrame `df` with three columns col_A, col_B, col_C:

df.select('col_B, 'col_C)   // works
df.select(-'col_A)          // does not work
df.select(*-'col_A)         // does not work
Note: I am looking for a DataFrame-API alternative to df.sqlContext.sql("select col_B, col_C ...") in the script above.
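For context, the DataFrame API has no "all columns except" selector, but the usual workaround is to filter `df.columns` (a `Seq[String]` of the column names) and splat the result into `select`. A minimal sketch of the filtering step in plain Scala, using the column names from the question; the Spark call itself is shown as a comment since it needs a live DataFrame:

```scala
object ExcludeColumn {
  def main(args: Array[String]): Unit = {
    // Stand-in for df.columns on the DataFrame from the question.
    val columns = Seq("col_A", "col_B", "col_C")

    // Keep every column except the one we want to drop.
    val keep = columns.filter(_ != "col_A")
    println(keep)  // List(col_B, col_C)

    // With a real DataFrame this would be:
    // import org.apache.spark.sql.functions.col
    // val result = df.select(keep.map(col): _*)
  }
}
```

Alternatively, `df.drop("col_A")` returns a new DataFrame without that column and has been available since Spark 1.4, so it should also work on 1.6.0.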