
dbutils.notebook.run() fails with job aborted but running the notebook individually works

auser85
New Contributor III

I have a notebook that runs many notebooks in order, along the lines of:

```
%python

notebook_list = ['Notebook1', 'Notebook2']

for notebook in notebook_list:
  print(f"Now on Notebook: {notebook}")
  try:
    dbutils.notebook.run(f'{notebook}', 3600)
  except Exception as e:
    print(e)
    pass
```

When I run this notebook, I get job aborted exceptions amounting to the following:

```
Exceptions:
(1): Task failed while writing rows,
(2): Failed to execute user defined function (SQLDriverLocal$$$Lambda$1707/462935920: (string) => string)
(3): No input widget named effective_dating is defined
```

For #2 -- I have no user-defined functions; I only use Spark functions.

However, if I run `Notebook1` or `Notebook2` individually (via the Run all button), they run as expected.

To make matters more intriguing, if I run the loop above a *second* time, each notebook runs fine, but **only after** I have run each one individually.

I would like to have a single notebook that runs many notebooks, rather than set up a pipeline that runs each individual notebook on its own.

Running the notebooks as a Workflow/Job also returns the same issue.

3 REPLIES

Prabakar
Databricks Employee

Hi @Andrew Fogarty, could you please share the stack trace? We need to test this to understand what went wrong. If you have a minimal repro, could you please export the .dbc file and attach it here?
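
For reference, the full stack trace can be captured on the caller side by logging `traceback.format_exc()` instead of printing only the exception message. A minimal sketch of the driver loop, assuming the same notebook list as above (this is plain Python, nothing Databricks-specific):

```
%python

import traceback

notebook_list = ['Notebook1', 'Notebook2']

for notebook in notebook_list:
  print(f"Now on Notebook: {notebook}")
  try:
    dbutils.notebook.run(notebook, 3600)
  except Exception:
    # Print the full stack trace rather than just the exception message
    print(traceback.format_exc())
```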

auser85
New Contributor III

Yes! FailDemo is the .dbc of the notebook loop. Right after this failure at 7:00:25 AM, I ran the notebook individually with no errors.

[Attachment: notebook on its own]

auser85
New Contributor III

I found the problem. Even if a notebook creates and fully specifies a widget, the notebook run process, e.g., dbutils.notebook.run('notebook'), will not know how to use it. If I replace the widget with a value that is not supplied by a widget, the process works fine.

How can I use my widgets without this failure?
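
One approach that may help: since dbutils.notebook.run executes the target notebook as a separate ephemeral run, widget values set interactively in the child are not carried over. Widget values can instead be passed through the third argument of dbutils.notebook.run and read in the child via dbutils.widgets.get. A minimal sketch, using the effective_dating widget name from the error above; the date value is just a placeholder:

```
%python

# Caller notebook: pass the widget value through the arguments map
# (third parameter of dbutils.notebook.run). "2023-01-01" is a placeholder.
dbutils.notebook.run('Notebook1', 3600, {"effective_dating": "2023-01-01"})
```

And in the child notebook, declaring the widget with a default keeps it runnable on its own with the Run all button:

```
%python

# Child notebook: declare the widget with a default so interactive runs
# still work, then read the value passed by the caller (if any).
dbutils.widgets.text("effective_dating", "")
effective_dating = dbutils.widgets.get("effective_dating")
```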
