Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

The functionality of table property delta.logRetentionDuration

Priyanka48
New Contributor III

We have a project requirement to retain only 14 days of history for our Delta tables. For testing, I set delta.logRetentionDuration to 2 days using the command below:

spark.sql("alter table delta.`[delta_file_path]` set TBLPROPERTIES ('delta.logRetentionDuration'='interval 2 days')")

However, when I tried it after the interval had passed (i.e., after two days), I could still time travel back to previous versions. Do we need to run VACUUM after setting this property, or does it only work for intervals greater than 30 days?

Can I please get help on this?

Also, will it physically delete the data files or will only log files be deleted?

5 REPLIES

UmaMahesh1
Honored Contributor III

Hi @Priyanka Mane​ ,

Quick notes:

Time travel to a previous version requires both the log files and the data files for that version.

VACUUM does not delete log files; it deletes only data files, which are never removed automatically unless you run VACUUM. Log files, by contrast, are cleaned up automatically after new checkpoints are written.

logRetentionDuration - each time a checkpoint is written, Databricks automatically cleans up log entries older than the retention interval. In your case, when a new checkpoint is written, it clears logs older than 2 days. Once this happens, you should no longer be able to time travel to those versions, because their log files are gone.

To delete the data files associated with those logs, you have to run VACUUM; there is no other way to remove the data.

logRetentionDuration accepts calendar intervals such as x days or x weeks. Months and years are not accepted.

Finally, all of this takes effect only when you perform a new transaction, so that a new checkpoint is written and the log cleanup is triggered.
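To make the checkpoint-time cleanup rule concrete, here is a small, purely illustrative Python sketch of the decision Delta applies when a checkpoint is written. The function name, data shapes, and dates are hypothetical; the real cleanup is done internally by Delta Lake, not by user code.

```python
from datetime import datetime, timedelta

def expired_log_versions(log_entries, retention, checkpoint_time):
    """Return the versions whose log entries fall outside the retention
    window at the moment a new checkpoint is written.
    Hypothetical illustration of delta.logRetentionDuration behavior."""
    cutoff = checkpoint_time - retention
    return [version for version, ts in log_entries.items() if ts < cutoff]

# Example: log entries for versions 0-3, retention of 2 days.
now = datetime(2023, 1, 10)
logs = {
    0: now - timedelta(days=5),
    1: now - timedelta(days=3),
    2: now - timedelta(days=1),
    3: now,
}
# Versions 0 and 1 are older than the 2-day window, so their log
# entries would be cleaned up (and time travel to them would fail).
print(expired_log_versions(logs, timedelta(days=2), now))  # [0, 1]
```

Note that if no new checkpoint is written, the cutoff is never evaluated, which matches the point above: a new transaction is needed before the retention setting has any visible effect.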

I hope these details help.

Cheers.

Thanks for the suggestion. I set the log retention duration to 2 days and performed a transaction on the table after 2 days. It has not deleted the older logs, and I can still time travel back to previous versions.

UmaMahesh1
Honored Contributor III

Adding some blogs for your reading:

https://mungingdata.com/delta-lake/vacuum-command/

youtube.com/watch?v=F91G4RoA8is

https://docs.databricks.com/delta/history.html

-werners-
Esteemed Contributor III

Hi, by default there is a safety interval enabled, so if you set a retention period lower than that interval (7 days), data within that interval will not be deleted.

You have to explicitly override this safety check by setting

spark.databricks.delta.retentionDurationCheck.enabled to false.

Then run VACUUM, and the data will be gone.
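The safety check described above can be sketched in plain Python; this is a hypothetical model of the guard, not Delta's actual implementation, and the function name and error message are invented for illustration.

```python
from datetime import timedelta

# Default retention threshold guarded by
# spark.databricks.delta.retentionDurationCheck.enabled (7 days = 168 hours).
DEFAULT_SAFETY_INTERVAL = timedelta(days=7)

def check_vacuum_retention(requested, safety_check_enabled=True):
    """Model of the VACUUM guard: reject retention periods shorter than
    7 days unless the safety check has been explicitly disabled.
    Hypothetical sketch for illustration only."""
    if safety_check_enabled and requested < DEFAULT_SAFETY_INTERVAL:
        raise ValueError(
            "requested retention is below the 168-hour safety threshold; "
            "set retentionDurationCheck.enabled to false to override")
    return True

# A 2-day retention passes only once the safety check is disabled.
print(check_vacuum_retention(timedelta(days=2), safety_check_enabled=False))  # True
```

This mirrors why the original 2-day setting alone did not remove data: until the check is disabled and VACUUM is run, files within the 7-day window remain.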

Kaniz_Fatma
Community Manager

Hi @Priyanka Mane​, We haven’t heard from you since the last response from @Werner Stinckens​ and @Uma Maheswara Rao Desula​, and I was checking back to see if their suggestions helped you.

Otherwise, if you have found a solution, please share it with the community, as it can be helpful to others.

Also, please don't forget to click the "Select As Best" button whenever the information provided helps resolve your question.
