Data Engineering
How to make the job fail via code after handling exception

kjoth
Contributor II

Hi, we are capturing exceptions with try/except when an error occurs. But we want the job status to be marked as failed once we catch the exception. What's the best way to do that?

We are using pyspark.

9 REPLIES

Hubert-Dudek
Esteemed Contributor III
try:
    ...
except Exception as error:
    dbutils.notebook.exit(str(jobId) + ' - ERROR!!! - ' + repr(error))

kjoth
Contributor II

Would it work if we are not running notebook-based jobs?
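For a non-notebook task (e.g. a Python script or wheel), a common pattern is to catch the exception, log it, and then exit with a non-zero status so the run is reported as failed. A minimal sketch, where `run_pipeline` is a hypothetical placeholder for your actual workload:

```python
import sys

def run_pipeline():
    # Hypothetical workload; replace with your actual pyspark logic.
    raise ValueError("simulated failure")

def main():
    try:
        run_pipeline()
    except Exception as err:
        # Handle/log the error, then exit non-zero so the job run is marked failed.
        print(f"Job failed: {err!r}", file=sys.stderr)
        sys.exit(1)

if __name__ == "__main__":
    main()
```

Simply re-raising the exception (instead of `sys.exit(1)`) also works, since an unhandled exception makes the Python process exit with a non-zero status.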

Kaniz
Community Manager

Hi @karthick J, this link should help you.

kjoth
Contributor II

Thanks for sharing the link. It doesn't help much in this case.

Hi @karthick J​ ,

Please try the code below:

# Errors in workflows throw a WorkflowException.
def run_with_retry(notebook, timeout, args={}, max_retries=3):
    num_retries = 0
    while True:
        try:
            return dbutils.notebook.run(notebook, timeout, args)
        except Exception as e:
            if num_retries > max_retries:
                raise e
            else:
                print("Retrying error", e)
                num_retries += 1

run_with_retry("LOCATION_OF_CALLEE_NOTEBOOK", 60, max_retries=5)

For more info please check the docs https://docs.databricks.com/notebooks/notebook-workflows.html#handle-errors
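The same retry-with-limit logic can be sketched without `dbutils`, for any callable, which also makes it usable outside notebooks (the helper name `retry_call` is an assumption, not part of the Databricks API):

```python
def retry_call(func, max_retries=3):
    # Same retry loop as run_with_retry above, generalized to any callable.
    num_retries = 0
    while True:
        try:
            return func()
        except Exception as e:
            if num_retries >= max_retries:
                raise  # give up: re-raise so the caller (and the job) fails
            print("Retrying after error:", e)
            num_retries += 1
```

Because the final `raise` propagates the exception once retries are exhausted, a job driven by this helper fails instead of silently succeeding.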

Anonymous
Not applicable

Hi @karthick J​ 

Hope you are well.

Just wanted to see if you were able to find an answer to your question?  If yes, would you be happy to mark it as best so that other members can find the solution more quickly?

Cheers!

No, I didn't get a solution. Maybe there isn't one for this use case.

lcalca95
New Contributor II

Hi,

I've built a job on a Python wheel and observed the opposite behaviour (the job failed). I'm using a try/except like yours. I've also tried creating a job from a simple notebook, and that job succeeded!

Does anyone know why the behaviour changes depending on the format?

AkA
New Contributor II

Instead of exiting the notebook, which marks the task/job as successful, the exception needs to be re-raised from the except block to fail the job.

try:
    <your code>
except Exception as err:
    <your block of exception handling>
    raise err
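A runnable sketch of this handle-then-re-raise pattern (`process_data` and `run_task` are hypothetical placeholders for your own code):

```python
def process_data():
    # Hypothetical workload; replace with your pyspark logic.
    raise RuntimeError("something went wrong")

def run_task():
    try:
        process_data()
    except Exception as err:
        print(f"Handled: {err!r}")  # your exception handling / logging
        raise  # re-raise so Databricks marks the task run as failed
```

A bare `raise` inside the except block preserves the original traceback, which is usually preferable to `raise err` for debugging.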
