We have a data stream from Event Hub with approximately 10 million rows per day landing in a single table. These records are insert-only (no updates). We are trying to find a solution to aggregate/group the data on multiple dimensions, and our requirements include:
- a maximum 30-second delay from Event Hub to the gold table
- comparing the current hour to last year's peak hour on various measures (e.g. total transaction count, sum of amount); see the sketch after this list
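For the second requirement, here is a minimal sketch of the comparison query, assuming hypothetical table names (`gold_hourly_live` for the current day's aggregates and `gold_hourly_archive` for the re-aggregated historical data described below) and hypothetical columns (`window_start`, `txn_count`, `total_amount`); both tables are assumed to hold one row per hour:

```python
from pyspark.sql import functions as F

# `spark` is the SparkSession provided by the Databricks runtime.

# Current (in-progress) hour from the live gold table.
live = (
    spark.table("gold_hourly_live")
    .where(F.col("window_start") == F.date_trunc("hour", F.current_timestamp()))
    .select("txn_count", "total_amount")
)

# Last year's busiest hour from the archive, by each measure.
hist = (
    spark.table("gold_hourly_archive")
    .where(F.year("window_start") == F.year(F.current_date()) - 1)
    .agg(
        F.max("txn_count").alias("max_txn_count"),
        F.max("total_amount").alias("max_total_amount"),
    )
)

# One-row comparison: current hour vs. last year's max hour.
comparison = live.crossJoin(hist).select(
    "txn_count",
    "max_txn_count",
    (F.col("txn_count") / F.col("max_txn_count")).alias("txn_vs_max"),
    "total_amount",
    "max_total_amount",
    (F.col("total_amount") / F.col("max_total_amount")).alias("amount_vs_max"),
)
```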
We are thinking of using DLT to process live transactions (retaining only the current day's data in the DLT tables) while, in parallel, moving the data to a set of archive tables on a nightly/periodic basis, re-aggregating it, and exposing it as a second dataset; we would then use both sources to compare live and historical data. We have not found similar industry examples and would like to know the positives and negatives of our approach, or of an alternate approach.
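For reference, a minimal sketch of what the live side of such a DLT pipeline could look like, assuming Event Hubs is read through its Kafka-compatible endpoint and using hypothetical names for the hub, payload schema, and grouping column; connection details would need to match your setup:

```python
import dlt
from pyspark.sql import functions as F
from pyspark.sql.types import (
    StructType, StructField, StringType, DoubleType, TimestampType,
)

# Hypothetical event payload; replace with your actual schema.
event_schema = StructType([
    StructField("event_time", TimestampType()),
    StructField("merchant_id", StringType()),
    StructField("amount", DoubleType()),
])

@dlt.table(comment="Raw events from Event Hubs via the Kafka-compatible endpoint")
def bronze_transactions():
    return (
        spark.readStream.format("kafka")
        # Event Hubs exposes a Kafka endpoint on port 9093 (SASL_SSL / PLAIN,
        # username "$ConnectionString"); fill in your namespace and secret.
        .option("kafka.bootstrap.servers", "<namespace>.servicebus.windows.net:9093")
        .option("subscribe", "<event-hub-name>")
        .option("kafka.security.protocol", "SASL_SSL")
        .option("kafka.sasl.mechanism", "PLAIN")
        .option("kafka.sasl.jaas.config", "<JAAS config using $ConnectionString>")
        .load()
    )

@dlt.table(comment="Live hourly aggregates for the current day (gold)")
def gold_hourly_live():
    events = (
        dlt.read_stream("bronze_transactions")
        .select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
        .select("e.*")
    )
    return (
        events
        # Watermark bounds streaming state; 30s is an assumption tied to the
        # latency target, not a guarantee of end-to-end delay.
        .withWatermark("event_time", "30 seconds")
        .groupBy(F.window("event_time", "1 hour"), "merchant_id")
        .agg(
            F.count("*").alias("txn_count"),
            F.sum("amount").alias("total_amount"),
        )
        .select(
            F.col("window.start").alias("window_start"),
            "merchant_id", "txn_count", "total_amount",
        )
    )
```

Note that actually meeting the 30-second end-to-end target would depend on running the pipeline in continuous mode and on trigger intervals and cluster sizing, not on the code shape alone.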