Hi @Faisal,
As a general best practice, you should retain as much historical data in the Bronze layer as is necessary to support data quality and accuracy.
To decide on a retention period, consider factors such as:
- Reconciliation and Auditing: Retain enough data to support any reconciliation, auditing, or compliance checks that may be necessary. This will depend on the regulatory or business requirements of your organization.
- Data Latency: Retain a large enough window of time for your data pipelines to capture and process batch and streaming updates. This will depend on your overall latency requirements and your data pipeline architecture.
- Data Size and Cost: Retain enough data so that downstream consumers like Silver tables don't miss relevant updates, while keeping the volume reasonable to avoid unnecessary storage costs.
Based on these factors, it is best to store as much historical data as necessary to meet your business requirements. In general, aim to retain at least a few days' or weeks' worth of data so your pipelines have a comfortable window for capturing incremental updates, depending on the frequency of ingestion and the rate of data accumulation.
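For illustration, here is a minimal sketch of how such a retention window might be enforced on a Bronze Delta table. The table name `bronze.events`, the `ingest_ts` column, and the 30-day window are assumptions; adjust them to your own schema and requirements. It also assumes an existing SparkSession named `spark`, as in a Databricks notebook.

```python
# Sketch: enforcing a retention window on a Bronze Delta table.
# `bronze.events` and its `ingest_ts` timestamp column are hypothetical names.

retention_days = 30  # tune to your reconciliation / latency requirements

# Remove Bronze rows older than the retention window.
spark.sql(f"""
    DELETE FROM bronze.events
    WHERE ingest_ts < current_timestamp() - INTERVAL {retention_days} DAYS
""")

# Reclaim the underlying storage for files no longer referenced and older
# than the retention window (RETAIN is expressed in hours).
spark.sql(f"VACUUM bronze.events RETAIN {retention_days * 24} HOURS")
```

Note that VACUUM only removes files older than the retention period you specify, so the storage reduction lags the DELETE.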
However, also be mindful of the impact that too much historical data can have on query performance and processing times for your data applications. In any case, the historical data should be ingested only once; after that, only the incremental changes should be captured. The retention policy can be revised over time as your business, regulatory, or performance requirements change.
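As a rough sketch of that incremental pattern, a Structured Streaming read from the Bronze table processes only new rows after the one-time history load, with progress tracked in a checkpoint. The table names and checkpoint path below are placeholders, not part of your setup.

```python
# Sketch: capturing only incremental changes from Bronze into Silver.
# Assumes an existing SparkSession `spark` (e.g., a Databricks notebook).
from pyspark.sql import functions as F

(
    spark.readStream
    .table("bronze.events")                       # streams only new Bronze rows
    .withColumn("processed_ts", F.current_timestamp())
    .writeStream
    .option("checkpointLocation", "/checkpoints/silver_events")
    .trigger(availableNow=True)                   # process available increments, then stop
    .toTable("silver.events")
)
```

Because the checkpoint records what has already been processed, re-running this job picks up only the changes since the last run, which keeps the one-time history load separate from ongoing incremental capture.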