
The functionality of table property delta.logRetentionDuration

Priyanka48
Contributor

We have a project requirement to keep only 14 days of history for Delta tables. For testing, I set delta.logRetentionDuration = 2 days using the command below:

spark.sql("alter table delta.`[delta_file_path]` set TBLPROPERTIES (โ€™delta.logRetentionDuration'='interval 2 daysโ€™)")

However, when I tried it after the specified interval (i.e., after two days), I could still time travel back to previous versions. Do we need to run VACUUM after setting this property, or does it only work for intervals longer than 30 days?

Can I please get help on this?

Also, will it physically delete the data files, or will only the log files be deleted?
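
For reference, here is roughly how I am checking that the property took effect and which versions are still visible (the table path is a placeholder):

# Confirm the table property and list the versions that are still available (placeholder path)
spark.sql("SHOW TBLPROPERTIES delta.`[delta_file_path]`").show(truncate=False)
spark.sql("DESCRIBE HISTORY delta.`[delta_file_path]`").show(truncate=False)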

4 REPLIES

UmaMahesh1
Honored Contributor III

Hi @Priyanka Mane,

Quick notes:

You need both the log and data files to time-travel to a previous version.

Vacuum - does not delete the log files; it only deletes data files, which are never removed automatically unless you run VACUUM. Log files, on the other hand, are cleaned up automatically after new checkpoints are written.

logRetentionDuration - Each time a checkpoint is written, Databricks automatically cleans up log entries older than the retention interval. In your case, when a new checkpoint is written, it clears the logs older than 2 days. Once that happens, you should no longer be able to time travel to those versions, as their log files are gone.

To delete the data files associated with those logs, you have to run VACUUM; there is no other way to remove the data.

logRetentionDuration takes a calendar interval such as x days or x weeks; months and years are not accepted.

Finally, all of this only takes effect once new transactions are written to the table, so that a new checkpoint is created and the logRetentionDuration cleanup can actually run.
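
Putting those pieces together, a rough sketch of the whole flow could look like the following. The path and the 2-day intervals are just placeholders matching your example, and delta.deletedFileRetentionDuration is the separate property that controls how old a removed data file must be before VACUUM may delete it:

# Sketch only - path and intervals are placeholders based on the 2-day example above.
# Keep log retention and deleted-file retention aligned so time travel and VACUUM agree.
spark.sql("""
  ALTER TABLE delta.`[delta_file_path]` SET TBLPROPERTIES (
    'delta.logRetentionDuration' = 'interval 2 days',
    'delta.deletedFileRetentionDuration' = 'interval 2 days'
  )
""")

# Log entries older than the interval are removed only when a new checkpoint is written,
# which needs new commits on the table (by default a checkpoint is created every 10 commits).

# Data files are removed only by VACUUM. Retaining less than the default 168 hours also
# needs the safety-check override mentioned further down in this thread.
spark.sql("VACUUM delta.`[delta_file_path]` RETAIN 48 HOURS")

With both intervals at 2 days, a fresh checkpoint, and VACUUM run, versions older than two days should no longer be reachable through time travel.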

I hope these details help.

Cheers.

Uma Mahesh D

Thanks for the suggestion. I have set the log retention duration to 2 days and performed a transaction on the table after 2 days. It has not deleted the older logs, and I can still time travel back to previous versions.
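
Could it be that no new checkpoint has been written yet? Cleanup only runs when a checkpoint is written, and by default a checkpoint is created only every 10 commits. This is roughly how I am checking the _delta_log for a checkpoint file (the path is a placeholder):

# List the transaction log and look for *.checkpoint.parquet files (placeholder path)
for f in dbutils.fs.ls("[delta_file_path]/_delta_log"):
    print(f.name)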

UmaMahesh1
Honored Contributor III

Adding some blogs for your reading:

https://mungingdata.com/delta-lake/vacuum-command/

youtube.com/watch?v=F91G4RoA8is

https://docs.databricks.com/delta/history.html

Uma Mahesh D

-werners-
Esteemed Contributor III

Hi, by default there is a safety interval enabled. So if you set a retention period lower than that interval (7 days), data within that interval will not be deleted.

You have to specifically override this safety interval by setting

spark.databricks.delta.retentionDurationCheck.enabled to false.

Then run VACUUM and the data will be gone.
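
Something like this, assuming the same placeholder path used earlier in the thread, and keeping 48 hours to match the 2-day example:

# Disable the 7-day safety check for this session (use with care: it protects files that
# concurrent readers or time travel within the window may still need).
spark.conf.set("spark.databricks.delta.retentionDurationCheck.enabled", "false")

# Remove data files that are no longer referenced and are older than the retain window.
spark.sql("VACUUM delta.`[delta_file_path]` RETAIN 48 HOURS")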
