11-28-2021 11:31 AM
What happens if we change the logic for the delta live tables and we do an incremental update. Does the table get reset (refresh) automatically or would it only apply the logic to new incoming data? would we have to trigger a reset in this case?
11-28-2021 11:34 PM
I doubt the table gets reset, as that would mean you could never change anything in a current setup.
Delta Live Tables was created to make life easier, so my guess is that the new logic is applied to new data only.
11-29-2021 02:37 AM
Delta is transactional, so nothing will be reset. Data can be deleted, but you can always go back to a past version using time travel 🙂
One thing you may be referring to is partitioning: when you overwrite data, all partitions are overwritten by default, but you can use dynamic partition overwrite so that only the partitions containing new data are replaced:
spark.conf.set("spark.sql.sources.partitionOverwriteMode", "dynamic")
11-29-2021 07:47 AM
Here is my finding on when to refresh (reset) the table:
If it is a complete table, all the changes are applied automatically.
If it is an incremental table, you need to do a manual reset (full refresh).
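If you need to trigger that full refresh programmatically rather than from the UI, something like the sketch below should work. It assumes the Pipelines REST API update endpoint; the workspace URL, token, and pipeline ID are placeholders, so verify the details against your workspace's API docs:

import requests

WORKSPACE_URL = "https://<your-workspace>.cloud.databricks.com"  # placeholder
TOKEN = "<personal-access-token>"                                # placeholder
PIPELINE_ID = "<pipeline-id>"                                    # placeholder

# Start a pipeline update with full_refresh=True to recompute the tables from scratch.
resp = requests.post(
    f"{WORKSPACE_URL}/api/2.0/pipelines/{PIPELINE_ID}/updates",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"full_refresh": True},
)
resp.raise_for_status()
print(resp.json())  # contains the id of the triggered update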
11-29-2021 10:47 AM
Hi @Mojgan Mazouchi ,
According to the docs "Tables can be incremental or complete. Incremental tables support updates based on continually arriving data without having to recompute the entire table. A complete table is entirely recomputed with each update."
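For example, in DLT's Python syntax (the paths and table names below are made up), an incremental table is one that reads a streaming source, while a complete table reads its inputs as a batch:

import dlt
from pyspark.sql.functions import col

# Incremental table: reads a streaming source (Auto Loader here), so each
# pipeline update only processes newly arriving data.
# Note: `spark` is provided by the DLT pipeline runtime.
@dlt.table
def cleaned_orders():
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load("/tmp/raw_orders")              # hypothetical landing path
        .where(col("amount") > 0)
    )

# Complete table: reads its input as a batch and is fully recomputed
# on every pipeline update.
@dlt.table
def daily_order_counts():
    return dlt.read("cleaned_orders").groupBy("order_date").count()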
11-29-2021 10:58 AM
That implies that if there is a logic change for a complete table, it is safe not to refresh the pipeline, since the table is recomputed on every update. Whereas for an incremental table, if the logic changes, we should do a full refresh of the pipeline to recompute the existing data, right?
12-10-2021 03:52 PM
According to the docs, yes.
12-12-2021 10:30 AM
Thanks Jose!