Parallel processing of JSON files in Databricks PySpark

AzureDatabricks
New Contributor III

How can we read files from Azure Blob Storage and process them in parallel in Databricks using PySpark?

As of now we are reading all 10 files at a time into a DataFrame and flattening it.

Thanks & Regards,

Sujata

5 REPLIES

-werners-
Esteemed Contributor III

If you use the Spark JSON reader, the read will happen in parallel automatically.

Depending on the cluster size, you will be able to read more files in parallel.

Mind that JSON files are usually small. Spark does not like a lot of small files, so performance may suffer.

Depending on the use case, it can be a good idea to do an initial conversion to Parquet/Delta Lake (which will take some time because of the many small files), and then keep adding new files to that table.

For your data jobs, you can then read the Parquet/Delta Lake table, which will be a lot faster.
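For example, a minimal sketch of that approach, assuming the blob container is already mounted at /mnt/jsondata and /mnt/delta/events is a placeholder target path:

# Initial conversion: read all the small JSON files in parallel and write one Delta table
raw = spark.read.json("/mnt/jsondata/*.json")
raw.write.format("delta").mode("overwrite").save("/mnt/delta/events")

# Later runs: append only the newly arrived files (the "new" subfolder is a placeholder)
new_files = spark.read.json("/mnt/jsondata/new/*.json")
new_files.write.format("delta").mode("append").save("/mnt/delta/events")

# Downstream jobs read the compacted Delta table instead of the raw JSON
events = spark.read.format("delta").load("/mnt/delta/events")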

AzureDatabricks
New Contributor III

Can you provide us a sample to read JSON files in parallel from blob storage? We are reading all files one by one from the directory, and it is taking time to load them into a DataFrame.

Thank you

-werners-
Esteemed Contributor III

spark.read.json("/mnt/dbfs/<ENTER PATH OF JSON DIR HERE>/*.json")

You first have to mount your blob storage to Databricks; I assume that is already done.

https://spark.apache.org/docs/latest/sql-data-sources-json.html
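If the storage is not mounted yet, here is a minimal sketch of the mount plus the parallel read, with placeholder names for the container, storage account, and secret scope:

# Mount the blob container once per workspace (names below are placeholders)
dbutils.fs.mount(
    source="wasbs://<container>@<storage-account>.blob.core.windows.net",
    mount_point="/mnt/jsondata",
    extra_configs={
        "fs.azure.account.key.<storage-account>.blob.core.windows.net":
            dbutils.secrets.get(scope="<scope>", key="<key>")
    }
)

# One call over the wildcard path reads every file in parallel,
# instead of looping over the files one by one
df = spark.read.json("/mnt/jsondata/*.json")

If the files share a known structure, passing an explicit schema via spark.read.schema(...).json(...) also skips the schema-inference pass over every file, which speeds up the load further.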

SailajaB
Valued Contributor III

Thank you. We are using a mount already.

Hi @Sailaja B,

Check the number of stages and tasks when you are reading the JSON files. How many do you see? Are your JSON files nested? How long does it take to read a single JSON file?
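A quick way to check this, as a minimal sketch assuming the mount path and the column names ("id", "items") are placeholders:

from pyspark.sql.functions import col, explode

df = spark.read.json("/mnt/jsondata/*.json")

# One task per partition; compare this number with the tasks shown in the Spark UI
print(df.rdd.getNumPartitions())

# Flattening a nested array column adds extra stages, which can dominate
# the runtime when there are many small files
flat = df.select(col("id"), explode(col("items")).alias("item"))
flat.printSchema()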
