Hi @Artem Sheiko,
Thank you for your detailed reply. I understand what you are referring to, but there is no requirement to process the data in batches. Also, it is just a replica of the original transactional database, so we need to copy the data to Delta Lake without applying any transformations.
Considering we are a small team, do we really need to split the data stream into many small single-table streams? How would that impact system performance? My understanding is that with streaming we start big and then gradually break things down into smaller streams if required.
If you could refer me to some documentation, that would be really helpful.
Thanks