Hi Team,
I need your inputs on designing the pool for our parallel processing.
We process files of around 4 to 5 GB each. The processing adds a row number, removes the header/trailer records, and appends 8 additional columns computed over all 104 columns per record. It is a CSV file loaded into Delta table storage after some basic validations. Each day carries two tasks (Validation and Transformation).
When we process continuously for 5 days, our jobs fail with a MAX_POOL_CAPACITY error.
Initially we had 20 instances, and we have now increased to 40. Still, 1 or 2 jobs are failing, which is strange.
Would you please guide me here?
There are no joins; it is a straightforward 1:1 load from CSV to Delta with some basic data quality checks.
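For context, here is a minimal plain-Python sketch of the per-record transformation described above (row numbering, header/trailer removal, appended derived columns). The field names and the derived-column logic are hypothetical stand-ins for our real 104-column schema; the production job itself runs on Spark writing to Delta.

```python
import csv
import io

def transform(csv_text):
    """Sketch of the Validation + Transformation step:
    drop the header and trailer records, prepend a row number,
    and append derived columns computed over the record's fields.
    The single derived column below (count of non-empty fields)
    is a hypothetical stand-in for the 8 real derived columns."""
    rows = list(csv.reader(io.StringIO(csv_text)))
    body = rows[1:-1]  # remove header record and trailer record
    out = []
    for i, rec in enumerate(body, start=1):
        if not rec:    # basic data quality check: skip empty records
            continue
        non_empty = sum(1 for field in rec if field.strip())
        out.append([str(i)] + rec + [str(non_empty)])
    return out
```

In Spark the same shape would be a `row_number()` window (or `zipWithIndex`), filters for the header/trailer records, and `withColumn` calls for the derived columns, written out with `df.write.format("delta")`.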
Regards, Nantha.