How to set retry attempts and email alerts with the error message of a Databricks notebook
12-07-2021 02:57 AM
How can I set retry attempts in a Databricks notebook, so that if any command/cell fails (for example because of a connection issue), that particular command/cell is rerun?
- Labels: Databricks notebook
12-07-2021 03:12 AM
- you can implement try/except in a cell and handle failures with dbutils.notebook.exit(jobId); other dbutils commands can help as well (see the sketches after this list),
- when a job fails, you can specify your email address to receive job alerts,
- additionally, if a notebook job fails, you can configure retries in the job task settings
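
A minimal sketch of the cell-level try/except retry idea, assuming the `spark` and `dbutils` objects that Databricks notebooks provide; the table name, retry count, and backoff are illustrative assumptions:

```python
import time

MAX_RETRIES = 3  # illustrative retry budget

for attempt in range(1, MAX_RETRIES + 1):
    try:
        # Replace with the command that may fail, e.g. due to a connection issue
        df = spark.read.table("my_schema.my_table")
        break  # success: stop retrying
    except Exception as e:
        if attempt == MAX_RETRIES:
            # Give up and report the failure to the calling notebook/job
            dbutils.notebook.exit(f"failed after {MAX_RETRIES} attempts: {e}")
        time.sleep(10 * attempt)  # simple linear backoff before the next attempt
```

For the job-level retries and failure email alerts, here is a hedged sketch using the Jobs API 2.1; the host, token, notebook path, and cluster ID are placeholders, and the same settings are also available in the Jobs UI:

```python
import requests

resp = requests.post(
    "https://<databricks-instance>/api/2.1/jobs/create",
    headers={"Authorization": "Bearer <personal-access-token>"},
    json={
        "name": "notebook-with-retries",
        # Send an email alert when the job fails
        "email_notifications": {"on_failure": ["you@example.com"]},
        "tasks": [
            {
                "task_key": "main",
                "notebook_task": {"notebook_path": "/Users/you@example.com/main_notebook"},
                "existing_cluster_id": "<cluster-id>",
                # Retry the task up to 3 times, waiting 60 s between attempts
                "max_retries": 3,
                "min_retry_interval_millis": 60000,
            }
        ],
    },
)
resp.raise_for_status()
```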
12-07-2021 03:26 AM
- "you can just implement try/except in cell, handling it by using dbutils.notebook.exit(jobId) and using other dbutils can help,
@HubertDudek As i am fresher in the databricks ,Could you please suggest /explain me in detail
12-07-2021 04:36 AM
https://docs.databricks.com/notebooks/notebook-workflows.html
dbutils.notebook.run() - runs another notebook from the main notebook
dbutils.notebook.exit("failed") - quits the notebook and can return a status to the main notebook (it can be called in an except block)
With these commands you can implement any logic of your own; a sketch follows below.
I also use Azure Data Factory to run Databricks notebooks, since with Data Factory you can nicely handle many data-flow scenarios depending on task success/failure/completion/timeout, etc.
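
A minimal sketch of the parent/child pattern the linked docs describe; the notebook path "./child_notebook", the parameter, and the "success"/"failed" status strings are illustrative assumptions:

```python
# Parent notebook: run a child notebook and react to its exit status.
# dbutils and spark are provided automatically in Databricks notebooks.

# Run ./child_notebook with a 600-second timeout and one parameter
result = dbutils.notebook.run("./child_notebook", 600, {"run_date": "2021-12-07"})

if result == "failed":
    # Raising here marks the job task as failed, which triggers
    # any configured retries and failure email alerts
    raise Exception("child_notebook reported failure")
```

```python
# Child notebook (./child_notebook): report status back via dbutils.notebook.exit
try:
    df = spark.read.table("my_schema.my_table")  # command that may fail
    dbutils.notebook.exit("success")
except Exception:
    dbutils.notebook.exit("failed")  # returned to dbutils.notebook.run in the parent
```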
12-07-2021 10:11 PM
Hi,
"I also use Azure Data Factory to run Databricks notebooks, since with Data Factory you can nicely handle many data-flow scenarios depending on task success/failure/completion/timeout, etc." - Can't we implement this type of scenario in Databricks itself using a multi-task job?
Thank you

