![AH_0-1717569489175.png AH_0-1717569489175.png](/t5/image/serverpage/image-id/8044iBE9DD885902E85EB/image-size/medium/is-moderation-mode/true?v=v2&px=400)
I have created 7 jobs, one per business system, to extract product data from each PostgreSQL source and write all of it into a single data lake Delta table [raw_product].
Each business system's product table holds around 20 GB of data.
I do the same thing for 15 tables.
Is there any way to read from the sources and write to the Delta tables faster?
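One common way to speed up the PostgreSQL side is to read in parallel instead of through a single JDBC connection, by passing non-overlapping `predicates` (or `partitionColumn`/`lowerBound`/`upperBound`/`numPartitions`) to `spark.read.jdbc`. Below is a minimal sketch of a helper that splits a numeric key range into WHERE clauses; the column name `product_id` and the connection details are assumptions, not taken from the screenshots:

```python
def id_range_predicates(min_id, max_id, num_partitions):
    """Split [min_id, max_id] into non-overlapping WHERE clauses,
    one per partition, for a parallel JDBC read."""
    step = (max_id - min_id + 1 + num_partitions - 1) // num_partitions
    predicates = []
    lo = min_id
    while lo <= max_id:
        hi = min(lo + step - 1, max_id)
        predicates.append(f"product_id BETWEEN {lo} AND {hi}")
        lo = hi + 1
    return predicates

# Sketch of how the predicates would feed a parallel read (names assumed):
# preds = id_range_predicates(1, 10_000_000, 16)
# df = spark.read.jdbc(url=jdbc_url, table="product",
#                      predicates=preds, properties=jdbc_props)
```

Each predicate becomes one Spark task, so 16 predicates means 16 concurrent PostgreSQL queries; raising the JDBC `fetchsize` property (e.g. 10000) also tends to help with 20 GB tables.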
One job looks like the one below:
![AH_1-1717572455868.png AH_1-1717572455868.png](/t5/image/serverpage/image-id/8045i9A6026F0E4D48027/image-size/medium/is-moderation-mode/true?v=v2&px=400)
The daily data is loaded into the Delta table using a MERGE command:
![AH_3-1717572644640.png AH_3-1717572644640.png](/t5/image/serverpage/image-id/8047i3943B7B9824E3364/image-size/medium/is-moderation-mode/true?v=v2&px=400)
![AH_2-1717572557758.png AH_2-1717572557758.png](/t5/image/serverpage/image-id/8046i538A965F8BFA18D8/image-size/medium/is-moderation-mode/true?v=v2&px=400)