Hi,
I am running a Notebook job that calls JAR code (application code implemented in C#). In the Spark UI, for almost 2 hours, it shows no tasks, and even the CPU usage is below 20% with very small memory usage. Before this 2-hour window it was showing tasks and CPU usage.
The application code writes to 2 tables. Before the 2-hour no-task window, the data had already been written to the 1st table, and after the window it started showing tasks again for writing to the 2nd table.
The job is running on 7.3 LTS with 300 cores in total and should be processing fewer than 40 million rows.
Is this a UX issue, or was the job actually not doing any work? Attached is what the UX shows.
I am trying to improve the performance, so any pointers on how to understand and optimize what the job is doing during this period would be helpful.
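In the meantime, one thing I am considering to narrow this down: logging wall-clock timestamps around each phase of the job, so the notebook logs show whether the 2-hour gap falls between the two table writes (suggesting driver-side or single-threaded work in the application code) rather than inside a Spark stage. A minimal, Spark-agnostic sketch; the phase names and the `time.sleep` stand-ins are placeholders for the real steps:

```python
import time
from contextlib import contextmanager

timings = {}

@contextmanager
def phase(name):
    """Record wall-clock seconds spent in a named phase of the job."""
    start = time.time()
    print(f"[{time.strftime('%H:%M:%S')}] starting {name}")
    try:
        yield
    finally:
        timings[name] = time.time() - start
        print(f"[{time.strftime('%H:%M:%S')}] finished {name} in {timings[name]:.1f}s")

# Hypothetical job structure: wrap each step so the timestamps show
# exactly where the long gap occurs.
with phase("write_table_1"):
    time.sleep(0.1)  # stand-in for the first table write
with phase("between_writes"):
    time.sleep(0.2)  # stand-in for whatever runs between the two writes
with phase("write_table_2"):
    time.sleep(0.1)  # stand-in for the second table write
```

If the gap lands in the middle phase while the Spark UI shows no tasks, that would point to work happening outside Spark rather than a UI glitch.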