03-15-2022 08:33 AM
I am trying to set the retention period for a Delta table using the following commands:
deltaTable = DeltaTable.forPath(spark,delta_path)
spark.conf.set("spark.databricks.delta.retentionDurationCheck.enabled", "false")
deltaTable.logRetentionDuration = "interval 1 days"
deltaTable.deletedFileRetentionDuration = "interval 1 days"
These commands are not working for me; no files are being removed after the given interval. Where am I going wrong?
03-15-2022 08:56 AM
There are two ways:
1) Set it on the cluster (Clusters -> edit -> Spark -> Spark config):
spark.databricks.delta.retentionDurationCheck.enabled false
2) Or set it just before DeltaTable.forPath (I think you need to change the order in your code):
spark.conf.set("spark.databricks.delta.retentionDurationCheck.enabled", "false")
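To tie the two posts together, here is a minimal sketch of the full flow, assuming a Databricks notebook where `spark` is the active SparkSession and `delta_path` is the questioner's table path. The retention durations are Delta table properties, so they are set with `ALTER TABLE ... SET TBLPROPERTIES` rather than attribute assignment on the `DeltaTable` object, and no files are physically removed until `VACUUM` actually runs:

```python
from delta.tables import DeltaTable

# Allow retention intervals shorter than the default 7 days.
# This must be set BEFORE running VACUUM.
spark.conf.set("spark.databricks.delta.retentionDurationCheck.enabled", "false")

# Retention settings are Delta table properties; set them via SQL,
# not by assigning attributes on the DeltaTable object.
spark.sql(f"""
    ALTER TABLE delta.`{delta_path}` SET TBLPROPERTIES (
        'delta.logRetentionDuration' = 'interval 1 days',
        'delta.deletedFileRetentionDuration' = 'interval 1 days'
    )
""")

# Files are only removed when VACUUM runs; the properties alone
# delete nothing.
deltaTable = DeltaTable.forPath(spark, delta_path)
deltaTable.vacuum(24)  # retain the last 24 hours of history
```

Note that this is a sketch, not a tested snippet: it requires a running Spark cluster with Delta Lake, so the exact path format (`delta.`...`` backtick syntax) and the property names should be checked against the Delta Lake version in use.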
03-16-2022 07:58 AM
Hi @Manasa Kalluri, It seems @Hubert Dudek has given a comprehensive solution. Were you able to solve your problem?
03-16-2022 11:39 PM
Hi @Kaniz Fatma, Yes, I was able to solve the issue! Thanks
03-16-2022 11:37 PM
Hi @Hubert Dudek, thanks for your response!
03-17-2022 12:47 AM
Hi @Manasa Kalluri, Thank you for the update. Would you like to mark @Hubert Dudek's answer as "Best"? It would help our community members hereafter.