Hi,
We have a requirement for a scenario to reprocess old data using a Data Factory pipeline. Here are the details:
Storage is ADLS Gen2.
Landing zone: data is stored in the same format as we receive it from the source. Data is loaded from SQL Server to ADLS Gen2 using a Data Factory copy activity.
Bronze layer: data from the landing zone is converted to Delta tables. This is done using Azure Databricks notebooks that run PySpark code (see the simplified sketch after this list).
Silver and gold layers: run Databricks notebooks (Python code).
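
For context, the bronze-layer notebook does something roughly along these lines. This is a minimal sketch only; the file format (parquet), storage account, folder layout, and table name below are illustrative, not our actual values:

```python
# Simplified sketch of the bronze notebook (paths and names are illustrative).
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Today's raw files as they arrived from the source (assumed parquet here).
landing_path = "abfss://landing@<storageaccount>.dfs.core.windows.net/sales/2024/01/15/"

df = spark.read.format("parquet").load(landing_path)

# Truncate-and-load: overwrite the bronze Delta table with today's files.
df.write.format("delta").mode("overwrite").saveAsTable("bronze.sales")
```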
Now, our requirement: we receive data daily as files. The landing zone keeps an archive of that data for 7 days, whereas the bronze layer is truncated and loaded every day.
We need to build reprocess logic where, if we pass a date as a parameter, it triggers the flow, picks up the old files for that date, and starts processing from the landing zone. Could you please help me with this?
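
To illustrate what we are hoping for, something like the sketch below in the bronze notebook, with the date coming in from a Data Factory pipeline parameter (passed to the notebook as a base parameter on the Databricks Notebook activity). The widget name, date format, folder layout, and table name are placeholders, not a working implementation:

```python
# Illustrative sketch only: reading a reprocess date passed from the ADF pipeline.
# "processing_date", the path layout, and the table name are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# The Databricks Notebook activity passes pipeline parameters as base parameters,
# which the notebook reads through widgets (dbutils is available in a notebook).
dbutils.widgets.text("processing_date", "")            # e.g. "2024-01-10"
processing_date = dbutils.widgets.get("processing_date")

# Build the landing-zone path for the requested date instead of today's date,
# assuming the 7-day archive is organised in year/month/day folders.
year, month, day = processing_date.split("-")
landing_path = (
    "abfss://landing@<storageaccount>.dfs.core.windows.net/"
    f"sales/{year}/{month}/{day}/"
)

# Same truncate-and-load into bronze as the daily run, just pointed at old files.
df = spark.read.format("parquet").load(landing_path)
df.write.format("delta").mode("overwrite").saveAsTable("bronze.sales")
```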