The functionality of table property delta.logRetentionDuration

Priyanka48
New Contributor III

We have a project requirement where we must store only 14 days of history for delta tables. So for testing, I set delta.logRetentionDuration = 2 days using the command below:

spark.sql("ALTER TABLE delta.`[delta_file_path]` SET TBLPROPERTIES ('delta.logRetentionDuration' = 'interval 2 days')")
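For reference, a minimal sketch to confirm the property took effect, using the same placeholder path as in the command above:

# Verify the table property; [delta_file_path] is the placeholder path from above.
spark.sql("SHOW TBLPROPERTIES delta.`[delta_file_path]`").show(truncate=False)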

However, when I checked after the interval had elapsed (i.e., after two days), I could still time travel back to previous versions. Do we need to run VACUUM after setting this property, or does it only work for intervals greater than 30 days?

Can I please get help on this?

Also, will it physically delete the data files, or will only the log files be deleted?

5 REPLIES

UmaMahesh1
Honored Contributor III

Hi @Priyanka Mane,

Quick notes:

You need both the log and data files to time-travel to a previous version.

Vacuum - does not delete the log files. It only deletes data files, which are never removed automatically unless you run VACUUM. Log files, by contrast, are cleaned up automatically after new checkpoints are written.

logRetentionDuration - each time a checkpoint is written, Databricks automatically cleans up log entries older than the retention interval. In your case, when a new checkpoint is written, it clears logs older than 2 days. Once that happens, you should no longer be able to time travel to those versions, as their log files are unavailable.

And to delete the data files associated with those logs, you have to run VACUUM; there is no other way to remove the data.

logRetentionDuration accepts calendar intervals such as x days, x weeks, etc. Months and years are not accepted.

And finally, all of this takes effect only when you perform a new transaction, so that a new checkpoint is written and logRetentionDuration is enforced (see the sketch below).
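As a rough illustration of that last point, here is a minimal sketch, assuming an active spark session and a Delta table at a hypothetical path:

# Hypothetical table path, for illustration only.
path = "/mnt/datalake/my_delta_table"

# Perform a new transaction; Delta writes a checkpoint every 10 commits by
# default, and log entries older than delta.logRetentionDuration are cleaned
# up when a checkpoint is written.
spark.range(10).write.format("delta").mode("append").save(path)

# List the versions still recorded in the transaction log.
spark.sql(f"DESCRIBE HISTORY delta.`{path}`").select("version", "timestamp").show()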

I hope these details help.

Cheers.

Thanks for the suggestion. I have set the log retention duration to 2 days and am performing a transaction on the table after 2 days. It has not deleted the older logs, and I can still time travel back to previous versions.

UmaMahesh1
Honored Contributor III

Adding some resources for your reading:

https://mungingdata.com/delta-lake/vacuum-command/

youtube.com/watch?v=F91G4RoA8is

https://docs.databricks.com/delta/history.html

-werners-
Esteemed Contributor III

Hi, by default there is a safety interval enabled. If you set a retention period lower than that interval (7 days), data within the interval will not be deleted.

You have to specifically override this safety interval by setting spark.databricks.delta.retentionDurationCheck.enabled to false.

Then run VACUUM and the data will be gone.
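A minimal sketch of that sequence, assuming a spark session and reusing the placeholder path from the original question; note that disabling the check is unsafe if concurrent readers or writers may still need the older files:

# Override the 7-day safety check (unsafe on tables with active readers/writers).
spark.conf.set("spark.databricks.delta.retentionDurationCheck.enabled", "false")

# Remove data files no longer referenced by versions within the last 48 hours.
spark.sql("VACUUM delta.`[delta_file_path]` RETAIN 48 HOURS")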

Kaniz
Community Manager

Hi @Priyanka Mane, We haven't heard from you since the last response from @Werner Stinckens and @Uma Maheswara Rao Desula, and I was checking back to see if their suggestions helped you.

Otherwise, if you have a solution, please share it with the community, as it can be helpful to others.

Also, please don't forget to click the "Select As Best" button whenever the information provided helps resolve your question.
