How to check a particular column value in a Spark DataFrame?
01-13-2023 04:37 AM
I want to check that a particular column in a DataFrame contains zero in every row; if any row does not contain zero, the job needs to fail.
01-13-2023 08:03 AM
Use the agg method to check whether the count of rows where columnName equals 0 matches the total number of rows in the DataFrame:

```scala
import org.apache.spark.sql.functions.{col, count, when}

df.agg(
    count("*").alias("total_count"),
    count(when(col("columnName") === 0, 1)).alias("zero_count")
  )
  .filter("total_count == zero_count")
  .count()
```

This returns 1 if all rows contain 0 in columnName, and 0 otherwise.
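To turn that check into an actual failure, one option (a sketch, not part of the answer above; the column name columnName and the sample data are placeholders) is to count the non-zero rows directly and throw when any are found:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, count, when}

val spark = SparkSession.builder().appName("zero-check").master("local[*]").getOrCreate()
import spark.implicits._

// Placeholder data; in practice this would be the real DataFrame.
val df = Seq(0, 0, 0).toDF("columnName")

// Count rows where columnName is non-zero; 0 means every row passes.
val nonZero = df
  .agg(count(when(col("columnName") =!= 0, 1)).alias("nonzero_count"))
  .first()
  .getLong(0)

// Fail the job with an exception if any non-zero value is present.
if (nonZero > 0)
  throw new IllegalStateException(s"columnName has $nonZero non-zero rows; failing the job")

spark.stop()
```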
01-16-2023 04:34 AM
No, that didn't work. What I actually want: the DataFrame has a column named count, and when that column's value is greater than 0, the job needs to exit.
The answer above addresses a different problem.
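One reading of this follow-up is: stop the job whenever any row of the count column is greater than 0. A minimal sketch under that assumption (the column name count and the sample data here are placeholders):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.col

val spark = SparkSession.builder().appName("count-check").master("local[*]").getOrCreate()
import spark.implicits._

// Placeholder DataFrame with a "count" column; replace with the real one.
val df = Seq(0, 0, 1).toDF("count")

// Number of rows whose count value is greater than 0.
val positives = df.filter(col("count") > 0).count()

// Exit the job (here, by throwing an exception) when any such row exists.
if (positives > 0)
  throw new IllegalStateException(s"$positives rows have count > 0; exiting the job")

spark.stop()
```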

