11-03-2021 09:26 AM
I am asking some questions as nobody is writing today 🙂
11-03-2021 09:28 AM
Thank you!
11-03-2021 09:30 AM
In my opinion, since VACUUM removes old files, it is better to run VACUUM first so there are fewer files to optimize. But maybe in some cases it is faster to optimize first and then delete the already-optimized files.
11-04-2021 05:17 AM
I optimize first, as Delta Lake knows which files are relevant for the optimize. That way my optimized data is available sooner. Then a vacuum. Seemed logical to me, but I might be wrong; never actually thought about it.
11-04-2021 05:30 PM
I agree with you @Werner Stinckens. I will always run OPTIMIZE first to compact the files (the default target size is 1 GB) and then run VACUUM. VACUUM will remove the old files that are no longer needed or have been marked obsolete by the OPTIMIZE command.
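For anyone searching later, here is a minimal sketch of that order in Databricks SQL, assuming a hypothetical table named events (the table name and the explicit retention window are my assumptions, not from this thread):

    -- Compact small files; superseded files are marked as removed in the Delta log
    OPTIMIZE events;

    -- Physically delete files that are no longer referenced and are older than the retention window
    -- (168 hours is the 7-day default; shorter windows require disabling the retention safety check)
    VACUUM events RETAIN 168 HOURS;

Running OPTIMIZE first means the new compacted files are available to readers right away, and the small files it superseded get cleaned up by VACUUM once they fall outside the retention window.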
a month ago
What about REORG on a Delta table? https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/delta-reorg-table
Does it help or make sense to run REORG, then OPTIMIZE and VACUUM, every week?
Reorganize a Delta Lake table by rewriting files to purge soft-deleted data, such as the column data dropped by ALTER TABLE DROP COLUMN.
Does it mean it only touches the Delta logs for the deleted columns or data, and not the actual data files?
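As I read the docs, it is the other way around: DROP COLUMN itself is the metadata-only step (the column disappears from the schema, but its bytes stay in the underlying Parquet files), and REORG ... APPLY (PURGE) is what rewrites the actual data files to physically purge that soft-deleted data. A minimal sketch, again on a hypothetical events table (the table and column names are made up):

    -- Metadata-only: requires column mapping; the column's data stays in the files for now
    ALTER TABLE events DROP COLUMN payload;

    -- Rewrites the affected data files so the purged column is physically gone
    REORG TABLE events APPLY (PURGE);

    -- The superseded files are removed later, once they age out of VACUUM's retention window
    VACUUM events;

So a weekly REORG only pays off if you actually drop columns or otherwise soft-delete data; otherwise OPTIMIZE plus VACUUM already covers compaction and cleanup.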