10-06-2021 06:13 PM
How do you work on fixing the small/big file problem? What do you suggest?
10-07-2021 09:54 AM
Hi @William Scardua ,
I recommend using Delta to avoid small/big file issues. For example, Auto Optimize is an optional set of features that automatically compacts small files during individual writes to a Delta table. Paying a small cost during writes offers significant benefits for tables that are queried actively. For more details and examples, please check the following link.
Auto Optimize will create files of about 128 MB each. If you would like to compact further, I recommend running the OPTIMIZE command on your Delta tables. By default, it compacts files to about 1 GB in size. For more details on the OPTIMIZE feature, please check the following link.
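As a sketch of what this looks like in Databricks SQL (the table name `my_delta_table` is a placeholder; the two table properties enable optimized writes and auto compaction):

```sql
-- Enable Auto Optimize on an existing Delta table
ALTER TABLE my_delta_table SET TBLPROPERTIES (
  'delta.autoOptimize.optimizeWrite' = 'true',
  'delta.autoOptimize.autoCompact'   = 'true'
);

-- Compact files that are already written (default target is ~1 GB per file)
OPTIMIZE my_delta_table;
```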
Thank you.
10-06-2021 11:48 PM
Hi @William Scardua! My name is Kaniz, and I'm the technical moderator here. Great to meet you, and thanks for your question! Let's see if your peers in the community have an answer to your question first. Or else I will get back to you soon. Thanks.
10-11-2021 02:14 PM
Okay @Jose Gonzalez, I understand. Thank you, man!
10-08-2021 12:01 AM
What Jose said.
If you cannot use Delta, or do not want to, then coalesce and repartition/partitioning are the way to control file size.
There is no single ideal file size; it all depends on the use case, the available cluster size, the downstream data flow, etc.
What you do want to avoid is a lot of small files (think only a few megabytes or kilobytes), but there is nothing wrong with a single file of 2 MB.
That being said, Delta Lake makes this exercise way easier.
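A rough sketch of the sizing arithmetic behind that coalesce/repartition choice (the 128 MB target and the helper name are illustrative assumptions, not a Spark API):

```python
import math

# A commonly used target: roughly 128 MB per output file
TARGET_FILE_BYTES = 128 * 1024 * 1024

def target_partitions(total_bytes: int, target_bytes: int = TARGET_FILE_BYTES) -> int:
    """Number of output partitions (i.e. files) so each is roughly target_bytes."""
    return max(1, math.ceil(total_bytes / target_bytes))

# Example: a ~10 GiB dataset comes out to 80 files of ~128 MB each.
n = target_partitions(10 * 1024**3)

# In Spark you would then write something like:
#   df.coalesce(n).write.parquet(path)      # reduce partitions, no shuffle
#   df.repartition(n).write.parquet(path)   # full shuffle, evenly sized files
```

`coalesce` is cheaper because it avoids a shuffle, but `repartition` gives more evenly sized output files; which one fits depends on how skewed the input partitions are.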
10-11-2021 02:16 PM
Thank you for the feedback, @Werner Stinckens. That's a good point.