Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

How to make the job fail via code after handling exception

kjoth
Contributor II

Hi, we are capturing the exception if an error occurs using try/except, but we want the job status to be Failed once we get the exception. What's the best way to do that?

We are using PySpark.
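Roughly, the pattern looks like this (process_data is a hypothetical stand-in for the actual job logic):

try:
    process_data(spark)  # hypothetical stand-in for the job's PySpark logic
except Exception as e:
    # The exception is handled here, so nothing propagates and the
    # task finishes with a "Succeeded" status even though it hit an error.
    print(f"Error while processing: {e}")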

10 REPLIES

Hubert-Dudek
Esteemed Contributor III
try:
    ...  # your job logic
except Exception as error:
    # jobId is assumed to be defined earlier in the notebook
    dbutils.notebook.exit(str(jobId) + ' - ERROR!!! - ' + repr(error))

kjoth
Contributor II

Would it work if we are not running notebook-based jobs?

Kaniz_Fatma
Community Manager

Hi @karthick J, this link should help you.

Thanks for sharing the link. It doesn't help much for this case.

Hi @karthick J,

Please try the code below:

# Errors in workflows throw a WorkflowException.
def run_with_retry(notebook, timeout, args = {}, max_retries = 3):
    num_retries = 0
    while True:
        try:
            return dbutils.notebook.run(notebook, timeout, args)
        except Exception as e:
            if num_retries > max_retries:
                raise e
            else:
                print("Retrying error", e)
                num_retries += 1

run_with_retry("LOCATION_OF_CALLEE_NOTEBOOK", 60, max_retries = 5)

For more info, please check the docs: https://docs.databricks.com/notebooks/notebook-workflows.html#handle-errors

Anonymous
Not applicable

Hi @karthick J,

Hope you are well.

Just wanted to see if you were able to find an answer to your question. If yes, would you be happy to mark it as best so that other members can find the solution more quickly?

Cheers!

No, I didn't get the solution. Maybe there isn't one for this use case.

lcalca95
New Contributor II

Hi,

I've built a job from a Python wheel and observed the opposite behaviour (the job failed). I'm using a try/except like yours. I've also tried creating a job from a simple notebook, and that job succeeded!

Does anyone know why the behaviour changes depending on the job format?

AkA
New Contributor II

Instead of exiting the notebook, which marks the task/job as successful, the exception needs to be raised again from the except block to fail the job.

try:
    <your code>
except Exception as err:
    <your exception handling>
    raise err
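For example, a minimal runnable sketch of this pattern (table names are illustrative):

import traceback

try:
    df = spark.read.table("source_table")  # illustrative table name
    df.write.mode("overwrite").saveAsTable("backup_table")
except Exception as err:
    # Log or otherwise handle the error first...
    print("Task failed:\n" + traceback.format_exc())
    # ...then re-raise so the task (and the job run) is marked Failed.
    raise

A bare raise re-raises the active exception with its original traceback, which keeps the full stack trace in the job's error output.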

kumar_ravi
New Contributor III

You can work around it like this:

# get_dbutils is the poster's helper for obtaining dbutils from a SparkSession.
dbutils = get_dbutils(spark)
tables_with_exceptions = []
for table_config in table_configs:
    try:
        process(spark, table_config)
    except Exception as e:
        exception_detail = f"Error processing table {table_config.table_name}: {e}"
        tables_with_exceptions.append(exception_detail)

if tables_with_exceptions:
    for exception in tables_with_exceptions:
        print(exception)
    raise RuntimeError(f"Job failed with exceptions: {tables_with_exceptions}")
else:
    dbutils.notebook.exit("completed without exceptions")

 
