What is the timeout for dbutils.notebook.run when timeout_seconds = 0?
11-20-2022 09:17 PM
Hello everyone,
I have several notebooks (around 10) and I want to run them in sequential order. At first I thought of using %run, but I have a variable that is used repeatedly in every notebook.
So now I am thinking of passing that variable from one main notebook (so that it is easier to change it manually in one place instead of in every notebook where it is used):
dbutils.notebook.run(path="test2", arguments={"current_year": current_year}, timeout_seconds=0)
However, I found in the documentation that this command will fail if the notebook takes more than 10 minutes, irrespective of the timeout we declare.
So I want to know whether the command will work even when the notebook takes more than 10 minutes.
When I tested this command with a notebook that ran for 13 minutes, dbutils.notebook.run worked.
Sometimes my notebook might take more than an hour, so I need some suggestions.
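For reference, here is a rough sketch of what I mean by a main notebook driving the others (the notebook names are placeholders; each child notebook would read the value with dbutils.widgets.get("current_year")):

# Main notebook: run the child notebooks one after another,
# passing the shared variable as a notebook argument.
current_year = "2022"  # the one value I want to change in a single place

notebooks = ["test1", "test2", "test3"]  # ...around 10 notebooks in total

for nb in notebooks:
    # timeout_seconds=0 is the part I am asking about
    result = dbutils.notebook.run(nb, timeout_seconds=0,
                                  arguments={"current_year": current_year})
    print(f"{nb} finished with exit value: {result}")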
12-03-2022 01:06 AM
Hi @pavan venkata
Yes, as the documentation says, 0 means no timeout. The notebook will take its sweet time to complete execution without throwing an error due to a time limit, be it 1 minute, 1 hour, 1 day or more. However, there is a limit of 30 days for job runs; you can find that in the same documentation.
The 10 minutes you are seeing is for Databricks downtime: if the Databricks service itself is down for more than 10 minutes, the notebook run fails regardless of the timeout. It has nothing to do with your code, because obviously you can't do anything when the Databricks service itself is not working.
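For example, roughly (assuming a child notebook called "test2"; the exception is caught generically here, just to show the difference between the two settings):

# No time limit: runs until the notebook finishes (subject to the 30-day job run limit)
dbutils.notebook.run("test2", timeout_seconds=0, arguments={"current_year": current_year})

# Finite limit: an exception is raised if the notebook is still running after 600 seconds
try:
    dbutils.notebook.run("test2", timeout_seconds=600, arguments={"current_year": current_year})
except Exception as e:
    print(f"Notebook timed out or failed: {e}")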
Cheers..
12-19-2022 05:35 PM
Thank you very much for replying, @Uma Maheswara Rao Desula.
So, in the end it means it is alright to use it, as long as the notebook doesn't take too long (i.e. doesn't hit the 30-day job run limit).
Also, can you explain a bit more about Databricks downtime? Are you talking about when Databricks does maintenance, or is it about the cluster being terminated?
Thank you
12-19-2022 09:27 PM
Yes. With timeout_seconds = 0 it no longer cares about how long the notebook runs. As for the downtime, that refers to downtime in the Databricks service itself. If the cluster gets terminated, you would receive an error for the cluster termination itself.
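If you are worried about transient failures (service downtime, cluster issues), one common pattern is a small retry wrapper around dbutils.notebook.run, something along these lines (just a rough sketch; the retry count and notebook name are placeholders):

def run_with_retry(path, timeout_seconds, arguments, max_retries=3):
    # Retry dbutils.notebook.run a few times before giving up
    for attempt in range(1, max_retries + 1):
        try:
            return dbutils.notebook.run(path, timeout_seconds, arguments)
        except Exception as e:
            print(f"Attempt {attempt} failed: {e}")
            if attempt == max_retries:
                raise

result = run_with_retry("test2", 0, {"current_year": current_year})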

