
Spark adding NUL

vasu_sethia
New Contributor II

Hi, I have a DF with a column containing a JSON string, so the value looks like {"key": Value, "anotherKey": anotherValue}. When I write the DF containing this string to CSV, Spark adds a NUL character at the front of the line and at the end, so the final line looks like

NUL{"key": Value, "anotherKey": anotherValue}NUL

I really don't want this to happen, how can I prevent this?

The code I am using is

df.coalesce(1).write.format("csv").option("header", false).option("quote", "").save(path)
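
For reference, a minimal, self-contained sketch of the kind of dataframe and write call involved (the column name, JSON content, and output path here are hypothetical, not taken from the thread):

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("nul-repro").getOrCreate()
import spark.implicits._

// A single-column dataframe whose value is a JSON string containing commas
val df = Seq("""{"key": "Value", "anotherKey": "anotherValue"}""").toDF("value")

// Same write options as in the question above
df.coalesce(1)
  .write
  .format("csv")
  .option("header", false)
  .option("quote", "")
  .save("/tmp/nul-repro")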

8 REPLIES

Piper_Wilson
New Contributor III

Hello, @Vasu Sethia! My name is Piper and I'm one of the moderators for Databricks. Welcome and thank you for your question. Let's give it a bit longer to see what the community has to say. Otherwise, we'll circle back around soon.

-werners-
Esteemed Contributor III

Are you writing the actual json string in a csv, or do you flatten the json into a table structure and write that to csv?

I have a value in my dataframe column in the format of a JSON string, and I am trying to write the dataframe to CSV:

Value
--------------------------
{"Name": ABC, "age": 12}
--------------------------

-werners-
Esteemed Contributor III

Hard to tell without having the code, but it might be the separator for the csv? You do have commas in the string, and comma is the default separator for csv.

df.coalesce(1).write.format("csv").option("header", false).option("quote", "").save(path)

This is the code, and yes, I do have commas in the string.

-werners-
Esteemed Contributor III

I mean the code for 'df'.

Can you try to write with option("sep", ";")?

Thank you so much, this worked for me.
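
A likely explanation, offered as an assumption rather than something stated in the thread: when the quote option is set to the empty string, Spark's CSV writer falls back to \u0000 (the NUL character) as its quote character, and any field containing the active delimiter gets wrapped in that quote. The JSON string contains commas, the default delimiter, so each line was being quoted with NUL on both ends; with ";" as the separator the field no longer contains the delimiter and nothing gets quoted. A sketch of the full write call with the suggested option (the output path is hypothetical):

// Semicolon separator, so the commas inside the JSON no longer trigger quoting
df.coalesce(1)
  .write
  .format("csv")
  .option("header", false)
  .option("sep", ";")
  .option("quote", "")
  .save("/tmp/json-as-csv")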

Hi @Vasu Sethia,

If Werners' response fully answered your question, would you be happy to mark the answer as best so that others can quickly find the solution?
