Pyspark CSV Incorrect Count
09-27-2022 06:58 AM
B1123451020-502,"","{""m"": {""difference"": 60}}","","","",2022-02-12T15:40:00.783Z
B1456741975-266,"","{""m"": {""difference"": 60}}","","","",2022-02-04T17:03:59.566Z
B1789753479-460,"","",",","","",2022-02-18T14:46:57.332Z
B1456741977-123,"","{""m"": {""difference"": 60}}","","","",2022-02-04T17:03:59.566Z
df_inputfile = (spark.read.format("com.databricks.spark.csv")
                .option("inferSchema", "true")
                .option("header", "false")
                .option("quotedstring", '\"')
                .option("escape", '\"')
                .option("multiline", "true")
                .option("delimiter", ",")
                .load('<path to csv>'))
print(df_inputfile.count())             # Prints 3
print(df_inputfile.distinct().count())  # Prints 4
I'm trying to read the data above from a CSV file and end up with an incorrect count, even though the DataFrame contains all the expected records: df_inputfile.count() prints 3 although it should be 4.
It looks like this is happening because of the lone comma in the 4th column of the 3rd row. Can someone please explain why?
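For anyone hitting the same mismatch, one possibility (an assumption of mine, not something confirmed in this thread) is that count() on a freshly read CSV takes a column-pruned fast path while distinct() forces every column to be parsed, and with multiline enabled the two paths can split the quoted lone-comma row differently. A minimal diagnostic sketch, reusing the same reader options; df_check is just a name for this sketch and the path placeholder is the one from above:
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

df_check = (spark.read.format("csv")
            .option("header", "false")
            .option("quote", '"')
            .option("escape", '"')
            .option("multiline", "true")
            .load('<path to csv>'))

# count() straight after the read may scan the file without parsing any columns.
print(df_check.count())

# Selecting a concrete column (Spark names them _c0, _c1, ... when header is false)
# forces each record to be parsed before it is counted.
print(df_check.select("_c0").count())

# Caching materialises the fully parsed rows, so this count reflects them as well.
print(df_check.cache().count())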
Labels: CSV, Difference
09-29-2022 11:23 PM
Hi, could you please double-check the syntax of this value: '\"'?
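For reference: in Python, '\"' is just the one-character string containing a double quote, so the value itself is valid syntax. As far as I can tell, the Spark CSV reader's documented options for quoting are named quote and escape, and "quotedstring" is not among them, so it would be silently ignored. A minimal sketch with the explicitly named options (spark is the SparkSession a Databricks notebook already provides; the path placeholder is the same as in the question):
# '\"' and '"' are the same one-character Python string: a double quote.
assert '\"' == '"'

# Equivalent reader configuration written with the documented option names.
df_quotes = (spark.read.format("csv")
             .option("header", "false")
             .option("quote", '"')    # character used to quote fields
             .option("escape", '"')   # character used to escape quotes inside quoted fields
             .option("multiline", "true")
             .load('<path to csv>'))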
10-04-2022 12:57 AM
Hi Debayan, there's no syntax error in the code snippet. Using .option("escape", '"') makes no difference; I still get the wrong count.
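One more way to see what the parser is doing (a sketch under my own assumptions, not a confirmed fix): read the file in PERMISSIVE mode with an explicit schema and a corrupt-record column, so any line that cannot be mapped onto seven fields keeps its raw text and shows how the lone-comma row is being split. The names c0..c6 and _corrupt_record are arbitrary choices for this sketch:
from pyspark.sql.types import StructType, StructField, StringType

# Seven string columns to match the sample rows, plus a slot for unparseable lines.
schema = StructType(
    [StructField(f"c{i}", StringType(), True) for i in range(7)]
    + [StructField("_corrupt_record", StringType(), True)]
)

df_debug = (spark.read.format("csv")
            .schema(schema)
            .option("header", "false")
            .option("quote", '"')
            .option("escape", '"')
            .option("multiline", "true")
            .option("mode", "PERMISSIVE")
            .option("columnNameOfCorruptRecord", "_corrupt_record")
            .load('<path to csv>'))

# Cache first so the corrupt-record column can be queried on the parsed result.
df_debug.cache()

# Rows the parser could not map onto the schema keep their raw text here.
df_debug.filter("_corrupt_record IS NOT NULL").show(truncate=False)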
10-04-2022 01:00 AM
Hi @Kaniz Fatma, unfortunately the suggestion hasn't helped, and I've not been able to figure out the reason for the strange results so far.
11-20-2022 07:43 PM
Hi @Tarique Anwer,
Hope all is well!
Just wanted to check in to see whether you were able to resolve your issue. If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help.
We'd love to hear from you.
Thanks!