How to filter the Spark UI for a notebook
04-26-2023 08:43 AM
When running Spark under YARN, each script has its own self-contained set of logs.
In Databricks, all I see is a list of jobs and stages that have been run on the cluster.
From a support perspective this is a nightmare.
How can a notebook's logs be grouped together so that it is possible to see all the activities that notebook has performed?
04-28-2023 10:54 AM
@Dean Lovelace: I can think of one way; does this help?
In Databricks, each notebook has a unique identifier called a "run ID" that can be used to filter the Spark UI to show only the activities performed by that notebook. Here's how you can filter the Spark UI for a notebook in Databricks:
- Run the notebook in Databricks.
- Once the notebook has completed or while it's running, navigate to the "Clusters" tab in the Databricks workspace.
- Click on the "View spark UI" button next to the cluster that was used to run the notebook.
- In the Spark UI, click on the "Application ID" link near the top of the page.
- This will take you to the "Application Detail" page. Look for the "Run ID" field, and copy the value.
- Go back to the Spark UI homepage and click on the "Filters" dropdown menu near the top of the page.
- In the "Filters" menu, click on "Add Filter" and select "Tag".
- In the "Tag" field, enter the value of the "Run ID" that you copied earlier.
- Click on "Apply".
After applying the filter, the Spark UI will only show the activities performed by the notebook with the specified run ID. This can be helpful for troubleshooting or auditing purposes, as it allows you to easily track the activities of a particular notebook.
05-09-2023 01:45 AM
Steps 4-9 don't work for me. I don't see any "Application ID" link or any way of filtering in the Spark UI.
I am using Databricks in Azure.
04-30-2023 11:25 PM
Hi @Dean Lovelace
Thank you for posting your question in our community! We are happy to assist you.
To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question?
This will also help other community members who may have similar questions in the future. Thank you for your participation and let us know if you need any further assistance!