Hi @Itai_Sharon,
When you call a Databricks notebook from another notebook (or from a Python file) using dbutils.notebook.run(), and the child notebook fails, the default behavior is for the child's error to be wrapped and propagated to the caller as a generic WorkflowException with a general message like: com.databricks.NotebookExecutionException: FAILED: Workload failed, see run output for details.
This means that the original exception (e.g., FileNotFoundError) and its stacktrace are not automatically surfaced to the caller—only the general workflow failure message appears.
If you run the same notebook manually, you do see the full, specific Python error log with stacktrace in the notebook UI.
- The error thrown by dbutils.notebook.run() is always a WorkflowException, which wraps the real error from the child notebook.
- There is no automatic surfacing of the nested error's stacktrace or details in the value returned to the caller.
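Given the two points above, the usual workaround (in line with the error-handling pattern in the linked docs) is to have the child notebook catch its own exception, serialize the error type, message, and stacktrace to JSON, and return that string via dbutils.notebook.exit(); the parent then parses the returned value. The helper below is a hedged sketch: only the serialization part is plain Python, and the dbutils calls appear as comments because dbutils exists only inside a Databricks runtime (the notebook name and run_etl() function are hypothetical placeholders).

```python
import json
import traceback

def capture_error_payload(exc: BaseException) -> str:
    """Serialize the currently handled exception into a JSON string
    that a child notebook could return via dbutils.notebook.exit().
    Must be called from inside the except block so that
    traceback.format_exc() sees the active exception."""
    return json.dumps({
        "status": "FAILED",
        "error_type": type(exc).__name__,
        "error_message": str(exc),
        "stacktrace": traceback.format_exc(),
    })

# In the child notebook (sketch; run_etl() stands in for your workload):
# try:
#     run_etl()
# except Exception as e:
#     dbutils.notebook.exit(capture_error_payload(e))

# In the parent notebook (sketch):
# result = dbutils.notebook.run("child_notebook", timeout_seconds=600)
# payload = json.loads(result)
# if payload.get("status") == "FAILED":
#     print(payload["error_type"], payload["error_message"])
#     print(payload["stacktrace"])
```

Note that dbutils.notebook.exit() ends the run successfully, so the parent must check the returned payload itself; if you need the job run to be marked as failed, re-raise after logging instead of exiting.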
I am sharing the official documentation for your reference: https://learn.microsoft.com/en-us/azure/databricks/notebooks/notebook-workflows#handle-errors