Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Optimize and Vacuum - which is the best order of operations?

Hubert-Dudek
Esteemed Contributor III

Optimize -> Vacuum

or

Vacuum -> Optimize

1 ACCEPTED SOLUTION


I agree with you @Werner Stinckens. I will always run OPTIMIZE first to compact the files (the default target file size is 1 GB) and then run VACUUM. VACUUM will remove the old files that are no longer needed or that have been marked obsolete by the OPTIMIZE command.
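The order agreed on above can be sketched in Databricks SQL; `events` is a hypothetical table name, and the retention window shown is the 7-day default:

```sql
-- Compact small files first; OPTIMIZE writes new compacted files and
-- marks the old pre-compaction files as no longer referenced in the Delta log.
OPTIMIZE events;

-- Then VACUUM physically deletes the unreferenced files once they are
-- older than the retention threshold (default 7 days = 168 hours).
VACUUM events RETAIN 168 HOURS;
```

Run in the other order, VACUUM would delete old snapshots but leave the small files in place; OPTIMIZE would then obsolete more files that must wait for the next VACUUM to be removed.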


6 REPLIES

Hubert-Dudek
Esteemed Contributor III

I am asking some questions since nobody is posting today 🙂

Thank you!

Hubert-Dudek
Esteemed Contributor III

In my opinion, since VACUUM removes old files, it is better to run VACUUM first so there are fewer files to optimize. But maybe in some cases it would be faster to delete files that have already been optimized (so OPTIMIZE first)?

-werners-
Esteemed Contributor III

I optimize first, as Delta Lake knows which files are relevant for the OPTIMIZE. That way my optimized data is available sooner; then I run a VACUUM. It seemed logical to me, but I might be wrong. Never actually thought about it.


shadowinc
New Contributor III

What about REORG TABLE? https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/delta-reorg-table

Does it help, or would it make sense to run REORG, then OPTIMIZE and VACUUM, every week?

The docs say: "Reorganize a Delta Lake table by rewriting files to purge soft-deleted data, such as the column data dropped by ALTER TABLE DROP COLUMN."
Does that mean it only touches the Delta logs for dropped columns or deleted data, and not the actual data files?
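For reference, the pattern the REORG docs describe can be sketched as follows (table name hypothetical). Per the documentation quoted above, REORG rewrites the data files themselves, not just the log, so the pre-rewrite copies still need a VACUUM to be physically removed:

```sql
-- Rewrite data files to purge soft-deleted data, e.g. column data left
-- behind by ALTER TABLE ... DROP COLUMN.
REORG TABLE events APPLY (PURGE);

-- The pre-rewrite files are now unreferenced in the Delta log;
-- VACUUM deletes them after the retention window elapses.
VACUUM events;
```

So a weekly REORG -> OPTIMIZE -> VACUUM sequence is coherent: REORG purges soft-deleted data, OPTIMIZE compacts, and VACUUM cleans up everything the first two steps obsoleted.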

