
How to read JSON from BytesIO with PySpark?

Galdino
New Contributor II

I want to read JSON from a BytesIO variable using PySpark.

My code using pandas:

from io import BytesIO
import pandas as pd

# `ftp` is an existing ftplib FTP connection
io = BytesIO()
ftp.retrbinary('RETR ' + file_name, io.write)  # download the file from FTP into memory
io.seek(0)

# With pandas
df = pd.read_json(io)

What I tried using PySpark, but it doesn't work:

io = BytesIO()
ftp.retrbinary('RETR ' + file_name, io.write)
io.seek(0)

# I already tried with format "json"
df = spark.read \
    .format("binaryfile") \
    .option('inferSchema', 'true') \
    .option('header', 'true') \
    .json(io)

Note: it is not possible to save to a file and read it afterwards.

3 REPLIES

Noopur_Nigam
Databricks Employee

Hi @João Galdino, there are a few points that are incorrect in your spark read command:

1) The syntax is wrong; it should be one of:

df = spark.read \
    .format("binaryFile") \
    .load("directory of file")

or

df = spark.read \
    .format("json") \
    .load("directory of file")

or

df = spark.read.json("directory of file")

You need to specify a file format and then provide the file's path.

2) Spark expects a source file path; it does not understand a BytesIO() object in the read statement (a sketch of a possible in-memory workaround follows).
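For example, one possible in-memory workaround, just a sketch assuming the downloaded file is JSON Lines (one JSON object per line) and that `io` is the BytesIO already filled as in the question: decode the bytes and hand Spark an RDD of JSON strings, since DataFrameReader.json also accepts an RDD of strings.

io.seek(0)
json_lines = io.read().decode('utf-8').splitlines()   # decode the in-memory bytes into text records
rdd = spark.sparkContext.parallelize(json_lines)       # distribute the records as an RDD of strings
df = spark.read.json(rdd)                              # spark.read.json accepts an RDD of JSON strings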

You can refer to the doc below for more detail on which sources Spark supports and how to read from and write to them:

https://docs.databricks.com/data/data-sources/index.html

Anonymous
Not applicable

Hey there @João Galdino,

Hope all is well!

Just wanted to check in: were you able to resolve your issue? If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help.

We'd love to hear from you.

Thanks!

Erik_L
Contributor II

Just use pandas and follow with

spark.createDataFrame(df)
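A minimal sketch of that approach, assuming the same `ftp` connection and `file_name` as in the question:

from io import BytesIO
import pandas as pd

io = BytesIO()
ftp.retrbinary('RETR ' + file_name, io.write)   # download the file from FTP into memory
io.seek(0)

pdf = pd.read_json(io)                 # parse the in-memory JSON with pandas
df = spark.createDataFrame(pdf)        # convert the pandas DataFrame into a Spark DataFrame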
