Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Displaying job-run progress when submitting jobs via databricks-sdk

jmeidam
New Contributor

When I run notebooks from within a notebook using `dbutils.notebook.run`, I see a nice progress table that updates automatically, showing the execution time, the status, and links to the notebooks. It is seamless.

My goal now is to execute many notebooks or Python scripts in parallel and have Spark figure out when to run what and which resources to use. Essentially, I want to create a job with a dynamic number of parallel tasks and submit it. I can do that using the submit method of the WorkspaceClient jobs API, and it works exactly as intended; however, the results are just a linear series of prints to standard output. I want a progress table like the one I get with `dbutils.notebook.run`. It is very convenient for navigating to failed notebooks/scripts and getting a clear overview of all the task results in that job run.
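For context, this is roughly how I build the dynamic task list (a minimal sketch: the helper name, notebook paths, and cluster id are placeholders, and the field names follow the Jobs API 2.1 `submit` payload; tasks with no dependencies between them run in parallel):

```python
# Sketch of the parallel-submit setup. The dicts mirror the Jobs API 2.1
# one-time-run ("submit") payload; paths and cluster id are placeholders.

def build_submit_payload(notebook_paths, cluster_id):
    """Build a one-off run payload with one parallel task per notebook."""
    tasks = [
        {
            "task_key": f"task_{i}",
            "existing_cluster_id": cluster_id,
            "notebook_task": {"notebook_path": path},
        }
        for i, path in enumerate(notebook_paths)
    ]
    return {"run_name": "parallel-notebooks", "tasks": tasks}

payload = build_submit_payload(
    ["/Repos/me/nb_a", "/Repos/me/nb_b"], "0000-000000-abcdefgh"
)
# With databricks-sdk, this payload maps onto WorkspaceClient().jobs.submit();
# the actual submission of course needs a live workspace.
print(len(payload["tasks"]))  # → 2
```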

Do any of you know how this progress table is generated, and can I reproduce that in my own processes and loops?

This is the table I want to see:
[Screenshot: job-run progress table]
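In the meantime, the rough fallback I'm considering is to poll the run and render a plain-text table myself (a hypothetical helper, not the native widget; in practice the rows would come from the task states returned by `WorkspaceClient().jobs.get_run(run_id)`):

```python
# Hypothetical fallback renderer: turn task statuses into a fixed-width
# plain-text table. Real rows would be built from jobs.get_run(run_id),
# which reports each task's state and a link to its run page.

def render_status_table(rows):
    """rows: list of (task_key, state, duration) tuples -> table string."""
    header = f"{'task':<16}{'state':<12}{'duration':<10}"
    lines = [header, "-" * len(header)]
    for task, state, duration in rows:
        lines.append(f"{task:<16}{state:<12}{duration:<10}")
    return "\n".join(lines)

print(render_status_table([
    ("nb_a", "SUCCESS", "2m 10s"),
    ("nb_b", "RUNNING", "1m 02s"),
]))
```

In a notebook this could be re-rendered in a loop (e.g. via `displayHTML`) until all tasks reach a terminal state, but it still wouldn't match the built-in table, hence the question.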
