I have a notebook that runs many notebooks in order, along the lines of:
```
%python
notebook_list = ['Notebook1', 'Notebook2']
for notebook in notebook_list:
    print(f"Now on Notebook: {notebook}")
    try:
        dbutils.notebook.run(notebook, 3600)
    except Exception as e:
        print(e)
```
When I run this notebook, I get job aborted exceptions amounting to the following:
```
Exceptions:
(1): Task failed while writing rows,
(2): Failed to execute user defined function (SQLDriverLocal$$$Lambda$1707/462935920: (string) => string)
(3): No input widget named effective_dating is defined
```
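For (3): as far as I understand, the child notebooks read that value with `dbutils.widgets.get('effective_dating')`, and `dbutils.notebook.run` can forward widget values through its third argument. A minimal sketch of what I mean (the notebook name and date value below are just placeholders, not my real job config):
```
%python
# Sketch only: forward the widget value the child notebook appears to expect.
result = dbutils.notebook.run(
    'Notebook1',                          # child notebook path (placeholder)
    3600,                                 # timeout in seconds
    {'effective_dating': '2024-01-01'}    # read in the child via dbutils.widgets.get('effective_dating')
)
print(result)
```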
For (2): I have no user-defined functions; I only use built-in Spark functions.
However, if I run `Notebook1` or `Notebook2` individually (with the Run All button), they run as expected.
To make matters more intriguing, if I then run the loop above a *second* time, each notebook runs fine, but **only after** I have first run each one individually.
I would like to have a single notebook that runs the others in sequence, rather than set up a pipeline that runs each notebook on its own.
Running the notebooks as a Workflow/Job produces the same errors.
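In case it helps, below is a purely diagnostic variant of the loop I sketched to capture which notebook fails and the full traceback, rather than just the exception message; it does not change the behaviour described above:
```
%python
import traceback

notebook_list = ['Notebook1', 'Notebook2']
failures = {}

for notebook in notebook_list:
    print(f"Now on Notebook: {notebook}")
    try:
        dbutils.notebook.run(notebook, 3600)
    except Exception:
        # Keep the full traceback per notebook instead of only the message
        failures[notebook] = traceback.format_exc()

for name, tb in failures.items():
    print(f"--- {name} ---\n{tb}")
```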