https://docs.databricks.com/notebooks/notebook-workflows.html

dbutils.notebook.run() - runs another notebook from the main notebook

dbutils.notebook.exit("failed") - exits the notebook and can return a status string to the main notebook (it can be called inside an except block)

With these commands you can implement your own orchestration logic.
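A minimal sketch of that run/exit pattern. On Databricks, `dbutils` is predefined in every notebook; the stub classes, the `/jobs/child` path, and the `CHILD_NOTEBOOKS` registry below are hypothetical, added only so the example runs locally and illustrates how the child's `exit()` value comes back to the caller of `run()`.

```python
# Local stand-in for Databricks' dbutils.notebook (illustration only;
# on Databricks you would use the real, predefined `dbutils`).

class _NotebookExit(Exception):
    def __init__(self, value):
        self.value = value

class _StubNotebook:
    def run(self, path, timeout_seconds, arguments=None):
        # On Databricks this launches `path` as a child notebook and
        # returns whatever that notebook passes to dbutils.notebook.exit().
        try:
            CHILD_NOTEBOOKS[path](arguments or {})
        except _NotebookExit as e:
            return e.value
        return None

    def exit(self, value):
        raise _NotebookExit(value)

class _StubDbutils:
    notebook = _StubNotebook()

dbutils = _StubDbutils()

# "Child notebook": does some work, reports status via exit() --
# including from inside an except block, as described above.
def child(args):
    try:
        if args.get("fail"):
            raise RuntimeError("something went wrong")
        dbutils.notebook.exit("OK")
    except RuntimeError:
        dbutils.notebook.exit("failed")

CHILD_NOTEBOOKS = {"/jobs/child": child}  # hypothetical registry/path

# "Main notebook": run the child and branch on its returned status.
status = dbutils.notebook.run("/jobs/child", 60, {"fail": "1"})
if status == "failed":
    print("child notebook reported failure")
```

The real `dbutils.notebook.run(path, timeout_seconds, arguments)` takes the same three arguments as the stub; the returned value is always a string, so the main notebook branches on it just like above.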

I also use Azure Data Factory to run Databricks notebooks, since Data Factory handles many data-flow scenarios nicely, branching on task success/failure/completion/timeout, etc.


My blog: https://databrickster.medium.com/