Hi
As part of our requirements, we need to load a large volume of historical data from the source system into the Bronze layer in Databricks and then process it on to Gold. Streaming is not practical for the historical volume, so for that one-time load we want to use a plain batch read and write. Once the historical load is complete, we want to switch to readStream/writeStream against the same target table, with a checkpoint so that tracking of the incremental (delta) data is handled automatically; these incremental loads will run frequently, roughly every 15 minutes. A rough sketch of the pattern we have in mind is below. Any suggestions on how this can be implemented?
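This is only a sketch of the idea, not our actual setup: the table name, paths, and the use of Auto Loader for the incremental files are placeholders/assumptions (our real source may differ, e.g. a JDBC source would need a different incremental mechanism).

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# 1) One-time historical load: plain batch read and write into the Bronze table.
#    (Path and format are placeholders.)
historical_df = spark.read.format("parquet").load("/mnt/source/historical/")
(historical_df.write
    .format("delta")
    .mode("append")
    .saveAsTable("bronze.events"))

# 2) Ongoing incremental load: switch to Structured Streaming on the same table.
#    Assumes new data lands as files in cloud storage and is picked up by Auto Loader;
#    the checkpoint tracks which source data has already been processed.
incremental_df = (spark.readStream
    .format("cloudFiles")                      # Auto Loader
    .option("cloudFiles.format", "parquet")
    .load("/mnt/source/incremental/"))

(incremental_df.writeStream
    .format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/bronze_events")
    .trigger(availableNow=True)                # run from a job scheduled every 15 minutes
    .toTable("bronze.events"))
```

The intent is that the batch write and the streaming write both append to the same Bronze Delta table, and the streaming checkpoint only ever sees the incremental source, so nothing from the historical load is reprocessed. Is this a reasonable way to do it, or is there a better approach?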