Hi isyed, apologies for the late response. For our use case, we changed the code from PySpark DataFrames to Spark SQL, which, instead of keeping all the records in memory, writes each result to a table and then performs the next loop iteration. Ours is typical hi...
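A minimal sketch of the pattern described above. Note this uses Python's built-in `sqlite3` as a stand-in for Spark SQL tables (the real code would issue `spark.sql(...)` against Delta/Hive tables); the table name `stage` and the doubling transformation are hypothetical, purely for illustration. The point is the shape of the loop: each iteration reads its input from a table and writes its output back to a table, so nothing accumulates in driver memory between passes.

```python
import sqlite3

# Stand-in for a Spark SQL table; in the actual job these would be
# spark.sql("CREATE/INSERT ...") calls against managed tables.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stage (iteration INTEGER, value INTEGER)")
conn.execute("INSERT INTO stage VALUES (0, 1)")  # hypothetical seed row

for i in range(1, 4):
    # Each pass derives new rows from the previous iteration's table output
    # and persists them immediately, instead of chaining DataFrames in memory.
    conn.execute(
        "INSERT INTO stage SELECT ?, value * 2 FROM stage WHERE iteration = ?",
        (i, i - 1),
    )
    conn.commit()

rows = conn.execute("SELECT value FROM stage ORDER BY iteration").fetchall()
print(rows)  # [(1,), (2,), (4,), (8,)]
```

In Spark the same shape also truncates the DataFrame lineage at every iteration (each read starts fresh from the table), which is what relieves the memory pressure.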
Hi @shahabm, I'm facing exactly the same issue, and increasing the driver type or the number of workers isn't helping either. Could you please guide me on how it got resolved for you, as I don't see the comment or post in which you got that advice? This problem is causing ...