Could you please try the following?

1) Set spark.databricks.delta.retentionDurationCheck.enabled to false.
2) Run VACUUM against the table location, e.g. VACUUM delta.`/data/events/` RETAIN 100 HOURS -- this vacuums files no longer required by table versions more than 100 hours old.
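Put together, a minimal sketch of both steps in a Databricks SQL session might look like this (the path delta.`/data/events/` is just the example location from above; substitute your own table path):

SET spark.databricks.delta.retentionDurationCheck.enabled = false;  -- allow a retention shorter than the default 7 days
VACUUM delta.`/data/events/` RETAIN 100 HOURS;                      -- remove files not needed by versions older than 100 hours
SET spark.databricks.delta.retentionDurationCheck.enabled = true;   -- re-enable the safety check afterwards

One caution: the retention check exists to protect you, because vacuuming with a short retention window can break time travel to older versions and can remove files still needed by long-running concurrent readers or writers, so only lower it if you're sure nothing depends on those older snapshots.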
You might also want to watch this talk: https://www.confluent.io/resources/online-talk/innovate-faster-and-easier-with-confluent-and-databricks-on-azure/