When I try to run a simple row-limiting pushdown query in Spark, I get an error back from the database: Spark has wrapped my original SQL query and appended a `WHERE 1=0` clause to it, so nothing is returned.
It seems very odd that Spark won't accept `SELECT TOP 100 * FROM Table` or `SELECT * FROM Table LIMIT 100`; both queries return nothing. It appears Spark wants me to pull in all 10M records and then limit them in the DataFrame. That seems crazy and is very taxing on the DB.
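For context, here is a simplified sketch (not Spark's actual source) of why the `WHERE 1= 0` clause appears: Spark's JDBC reader treats the `dbtable` option as a table name and substitutes it into its own schema-probe query, so a raw `SELECT TOP 100 ...` string produces invalid SQL:

```python
def schema_probe(dbtable: str) -> str:
    # Sketch of how Spark's JDBC source probes the schema: it wraps whatever
    # you pass as `dbtable` in its own SELECT and appends WHERE 1=0 so the
    # database returns column metadata but zero rows.
    return f"SELECT * FROM {dbtable} WHERE 1=0"

# Passing a raw query yields nested, invalid SQL -- the error I am seeing:
print(schema_probe("SELECT TOP 100 * FROM Table"))
# -> SELECT * FROM SELECT TOP 100 * FROM Table WHERE 1=0

# A parenthesized subquery with an alias (names hypothetical) stays valid:
print(schema_probe("(SELECT TOP 100 * FROM Table) t"))
# -> SELECT * FROM (SELECT TOP 100 * FROM Table) t WHERE 1=0
```

This is only my understanding of the mechanism; the question stands as to why a plain limit query can't be pushed down directly.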