03-15-2022 08:33 AM
I am trying to set the retention period for a Delta table using the following commands.
deltaTable = DeltaTable.forPath(spark,delta_path)
spark.conf.set("spark.databricks.delta.retentionDurationCheck.enabled", "false")
deltaTable.logRetentionDuration = "interval 1 days"
deltaTable.deletedFileRetentionDuration = "interval 1 days"
These commands are not working for me; they aren't removing any files within the given interval. Where am I going wrong?
03-15-2022 08:56 AM
There are two ways:
1) Set it on the cluster (Clusters -> edit -> Spark -> Spark config):
spark.databricks.delta.retentionDurationCheck.enabled false
2) or set it just before DeltaTable.forPath (I think you need to change the order in your code):
spark.conf.set("spark.databricks.delta.retentionDurationCheck.enabled", "false")
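Building on the above, a minimal sketch of the full flow (assuming a Databricks/Delta Lake cluster; `delta_path` is the table location from the question). Note that `logRetentionDuration` and `deletedFileRetentionDuration` are table properties, not Python attributes on `DeltaTable`, and that no files are removed until VACUUM runs:

```python
# Sketch, assuming a Databricks runtime where `spark` is the active SparkSession
# and `delta_path` points at an existing Delta table.
from delta.tables import DeltaTable

# Allow retention intervals shorter than the default 7-day safety check.
spark.conf.set("spark.databricks.delta.retentionDurationCheck.enabled", "false")

# Retention is configured as table properties, not as Python attributes.
spark.sql(f"""
    ALTER TABLE delta.`{delta_path}` SET TBLPROPERTIES (
        'delta.logRetentionDuration' = 'interval 1 days',
        'delta.deletedFileRetentionDuration' = 'interval 1 days'
    )
""")

# VACUUM is what actually deletes stale data files; setting the
# properties alone removes nothing.
deltaTable = DeltaTable.forPath(spark, delta_path)
deltaTable.vacuum(24)  # retain 24 hours, i.e. 1 day
```

Setting the properties only defines the threshold; scheduling a periodic VACUUM (or calling it explicitly as above) is what reclaims storage.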
03-16-2022 07:58 AM
Hi @Manasa Kalluri , It seems @Hubert Dudek has given a comprehensive solution. Were you able to solve your problem?
03-16-2022 11:39 PM
Hi @Kaniz Fatma , Yes I was able to solve the issue! Thanks
03-16-2022 11:37 PM
Hi @Hubert Dudek , thanks for your response!
03-17-2022 12:47 AM
Hi @Manasa Kalluri , thank you for the update. Would you like to mark @Hubert Dudek 's answer as "Best"? It would help other community members hereafter 😊