In the Spark UI, I can see the running application with its application ID. From the Spark UI, is it possible to see which notebook is running under that application?
I am interested in learning more about jobs and stages and how they work internally in Spark, with the help of the Spark UI in Databricks. Could you guide me on the path to learn this in more depth (e.g., share a learning agenda, materials, or online courses to check out)?
Note: as a first step, I need to understand how one complete notebook execution is handled in terms of jobs and stages in PySpark.
Thanks.