How to read a json from BytesIO with PySpark?

Galdino
New Contributor II

I want to read a JSON file from an in-memory IO variable using PySpark.

My code using pandas:

io = BytesIO()
ftp.retrbinary('RETR ' + file_name, io.write)
io.seek(0)

# With pandas
df = pd.read_json(io)

What I tried using PySpark, which doesn't work:

io = BytesIO()
ftp.retrbinary('RETR ' + file_name, io.write)
io.seek(0)

# I already tried format "json" instead of "binaryfile"
df = spark.read \
    .format("binaryfile") \
    .option('inferSchema', 'true') \
    .option('header', 'true') \
    .json(io)

Note: it is not possible to save the data to a file and read it back afterwards.

3 REPLIES

Noopur_Nigam
Valued Contributor II

Hi @João Galdino, there are a few points that are incorrect in your Spark read command:

1) The syntax is wrong; it should be one of the following:

df = spark.read \
    .format("binaryFile") \
    .load("directory of file")

or

df = spark.read \
    .format("json") \
    .load("directory of file")

or

df = spark.read.json("directory of file")

You need to specify a file format and then provide the file's path.

2) Spark expects a source file path; it does not understand a BytesIO() object in the read statement.

You can refer to the doc below for more details on which sources Spark supports and how to read from and write to them:

https://docs.databricks.com/data/data-sources/index.html
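
If writing the data to a file really is not an option, a minimal workaround sketch, assuming the payload is UTF-8 encoded JSON Lines (one JSON object per line), is to decode the in-memory bytes and pass an RDD of JSON strings to spark.read.json, which accepts an RDD of strings as well as a path; for a single multi-line JSON document, wrap the whole decoded string in a one-element list instead of splitting it.

from io import BytesIO

# Sketch only: `ftp`, `file_name`, and `spark` are the objects from the question
io = BytesIO()
ftp.retrbinary('RETR ' + file_name, io.write)
io.seek(0)

# Decode the bytes (assumes UTF-8) and split into one JSON document per line
json_lines = io.getvalue().decode("utf-8").splitlines()

# spark.read.json also accepts an RDD of strings holding JSON objects
df = spark.read.json(spark.sparkContext.parallelize(json_lines))
df.show()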

Anonymous
Not applicable

Hey there @João Galdino,

Hope all is well!

Just wanted to check in to see whether you were able to resolve your issue. If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help.

We'd love to hear from you.

Thanks!

Erik_L
Contributor II

Just use pandas, then follow with

spark.createDataFrame(df)
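
A minimal sketch of that suggestion, reusing the ftp, file_name, and spark objects from the question:

import pandas as pd
from io import BytesIO

io = BytesIO()
ftp.retrbinary('RETR ' + file_name, io.write)
io.seek(0)

pdf = pd.read_json(io)            # pandas reads file-like objects directly
df = spark.createDataFrame(pdf)   # build a Spark DataFrame from the pandas one
df.show()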
