Community Discussions
Connect with fellow community members to discuss general topics related to the Databricks platform, industry trends, and best practices. Share experiences, ask questions, and foster collaboration within the community.

My notebooks running in parallel no longer include jobids

Benedetta
New Contributor III

Hey Databricks - what happened to the jobids that used to be returned from parallel runs? We used them to identify which link matched the output. See attached. How are we supposed to match up the links?
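For context, our pattern looks roughly like this (names and paths are illustrative; on Databricks the stand-in `run_child` below would be a call to `dbutils.notebook.run`, and the child would return its payload via `dbutils.notebook.exit`):

```python
from concurrent.futures import ThreadPoolExecutor

def run_child(notebook_path, params):
    # Stand-in for dbutils.notebook.run(notebook_path, timeout, params).
    # We fake the child's exit payload here so the matching pattern is visible;
    # previously the UI also surfaced a job id next to each run link.
    return {"job_id": f"job-{params['batch']}", "result": "ok", "batch": params["batch"]}

def run_all(batches):
    # Launch one child run per batch, keeping each batch label alongside its
    # future so every output can be matched back to the run that produced it.
    with ThreadPoolExecutor(max_workers=4) as pool:
        futures = {pool.submit(run_child, "/Shared/child", {"batch": b}): b
                   for b in batches}
        return {b: f.result() for f, b in futures.items()}

results = run_all(["a", "b", "c"])
```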

 

6 REPLIES

Kaniz
Community Manager

Hi @Benedetta, in Databricks, parallel runs execute multiple tasks concurrently.

 

Here are some key points related to job IDs and parallel execution:

 

Job IDs in Parallel Runs:

  • Previously, Databricks returned job IDs for parallel runs. These IDs helped identify which link corresponded to the output of a specific task.
  • However, recent changes might have impacted the availability of these job IDs. Let’s explore alternative approaches to achieve your goal.

Matching Links to Outputs:

  • To match up links with specific outputs, consider the following strategies:
    • Job-Level Parameters: Configure parameters at the job level that are passed to all tasks. These parameters can carry the information you need to identify each output.
    • Dynamic Value References: Use dynamic value references to share context between jobs and tasks. These references allow you to pass information dynamically.
    • Tags: Add tags (labels or key-value attributes) to your job. Tags can help you filter and organize jobs based on specific criteria.
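The strategies above can be sketched as a Jobs API 2.1-style job spec. This is a hypothetical payload; the field names (`tags`, `parameters`, `base_parameters`) and the `{{job.run_id}}` dynamic value reference follow the current docs, but verify them against your workspace's API version before relying on them:

```python
import json

# Hypothetical job spec combining the three strategies:
job_spec = {
    "name": "parallel-child-runs",
    # Tags: key-value labels for filtering and organizing jobs.
    "tags": {"team": "analytics", "purpose": "parallel-fanout"},
    # Job-level parameters: visible to every task in the job.
    "parameters": [
        {"name": "batch_id", "default": "unset"},
    ],
    "tasks": [
        {
            "task_key": "child",
            "notebook_task": {
                "notebook_path": "/Shared/child",
                # Dynamic value reference: surfaces the run id inside the task
                # so the child can report which run produced its output.
                "base_parameters": {"run_id": "{{job.run_id}}"},
            },
        }
    ],
}
payload = json.dumps(job_spec)
```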

Cluster Execution Contexts:

  • Databricks imposes a hard limit of 145 active execution contexts on a cluster. This ensures the cluster isn’t overloaded with too many parallel threads competing for resources.
  • If you have more than 145 parallel jobs, consider creating a new cluster to manage the load effectively.
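A client-side throttle along these lines keeps concurrent runs under that limit (a sketch; `run_child` stands in for the `dbutils.notebook.run` call you would make on Databricks):

```python
from concurrent.futures import ThreadPoolExecutor

MAX_CONTEXTS = 145  # the per-cluster execution-context limit mentioned above

def run_child(batch):
    # Placeholder for dbutils.notebook.run("/Shared/child", 0, {"batch": batch}).
    return f"done-{batch}"

def run_throttled(batches, limit=MAX_CONTEXTS):
    # Cap the worker pool below the cluster's context limit so excess runs
    # queue on the client side instead of failing on the cluster.
    with ThreadPoolExecutor(max_workers=min(limit, len(batches))) as pool:
        return list(pool.map(run_child, batches))
```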

Editing Job Configuration:

  • To modify job settings, follow these steps:
    • Click Workflows in the sidebar.
    • Locate the job name in the Name column and click it.
    • You can adjust various configurations in the side panel, including triggers, compute settings, notifications, and maximum concurrent runs.
    • If job access control is enabled, you can also edit permissions.
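The same settings can be changed programmatically. Below is a sketch of the equivalent Jobs API call; the endpoint (`/api/2.1/jobs/update`) and field names follow the 2.1 docs, and the job id is a made-up example:

```python
import json

# Hypothetical jobs/update request body: only the fields listed in
# new_settings are changed, everything else is left as-is.
update = {
    "job_id": 12345,  # illustrative job id
    "new_settings": {
        "max_concurrent_runs": 8,
        "email_notifications": {"on_failure": ["ops@example.com"]},
    },
}
body = json.dumps(update)
# POST this body to https://<workspace-host>/api/2.1/jobs/update
# with a bearer token in the Authorization header.
```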

If you encounter any specific challenges, please ask for further assistance! 😊

Benedetta
New Contributor III

Hi Kaniz,

Thank you for your prompt reply; it is greatly appreciated. These job clusters are spun up on demand, so there are no job names and nothing to configure or reconfigure in the Workflows tab. We are already passing parameters to the child notebooks and have been capturing the job id along the way to identify which ephemeral notebook we may wish to troubleshoot. Previously, this was an easy way to match each link with the messaging passed back from the child notebook.

I do notice that Ctrl-F on a specific parameter value will highlight the new link as well as the messaging from the child notebook, indicating which link to click. However, Ctrl-F on the job id does not highlight the link, leaving one to hunt and peck for the correct link when no parameter value is available.

Honestly, this seems like an oversight. The job id is readily available in the task run details, so why not keep it surfaced? I'm lucky that my design already includes the parameters and that Ctrl-F can find them, but for designs that didn't include this, it could mean a rewrite.
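To illustrate the pattern we rely on (names are illustrative; on Databricks the child would call `dbutils.notebook.exit(json.dumps(payload))` and the parent would parse the string returned by `dbutils.notebook.run`):

```python
import json

def child_exit_payload(job_id, param_value):
    # What the child notebook passes back on exit: its job id plus the
    # parameter value it was invoked with, so the parent can match them up.
    return json.dumps({"job_id": job_id, "param": param_value, "status": "ok"})

# Parent side: parse the child's returned string.
returned = child_exit_payload("123456", "batch-a")
info = json.loads(returned)
# info["job_id"] is the value we used to Ctrl-F for the matching run link.
```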

Thank you for your attention,

Benedetta

 


Benedetta
New Contributor III

This issue seems to have resolved itself - the jobids are again available for viewing and clicking on to get to the ephemeral notebook.

Did Databricks roll back a version of the UI?

Thanks!

Benedetta

Benedetta
New Contributor III

Hello,

This problem (feature?) is occurring again today. The job ids are hidden in the new output that Databricks is providing; the original png file attached still represents the issue. Additionally, and potentially a worse problem, the ephemeral notebooks are missing: the notebook links provided merely point to the notebook that was run, not to the actual run of that notebook. How do we troubleshoot this? What's going on?

Thank you for your attention

Benedetta Manocchio

Benedetta
New Contributor III

Hey @Databricks, @Kaniz, whatcha doing? Yesterday's "newer version of the app" that got rolled out seems to have broken the parallel runs. The ephemeral notebook is missing. The job ids are missing. What's up?

Benedetta
