Identify the associated notebook for an application running from the Spark UI
03-04-2024 08:35 AM
In the Spark UI, I can see a running application with its application ID. From the Spark UI, is it possible to see which notebook is running as that application?
I am also interested in learning more about how jobs and stages work internally in Spark, with the help of the Spark UI in Databricks. Could you guide me on a path to learn this in depth (a learning agenda, material, an online course to check, etc.)?
Note: as a first step, I need to understand how the execution of one complete notebook maps to jobs and stages in PySpark.
Thanks.
03-05-2024 06:49 AM
spark.sparkContext.setJobDescription("my name") will make your life easier. Just put it in the notebook. (The method lives on the SparkContext, not on the SparkSession.)
You should also set a fresh description before each action (show, count, toPandas, write, an aggregation, etc.), since the description applies to the jobs triggered after it is set. Each job then appears under its own label in the Spark UI's Jobs page.

