
Error Handling and Custom Messages in Workflows

CaptainJack
New Contributor III

I would like to be able to surface a custom error message, ideally visible in the Workflows > Jobs UI.

1. For example, a workflow failed because a file was missing and could not be found. In this case I get "Status" Failed and "Error Code" RunExecutionError. If I hover the cursor over the red Status icon, I can see "[RunExecutionError] Workload failed, see run output for details". Is there any way to show a custom error there?

 

2. I would like to use /api/2.1/jobs/runs/get to get some information about the error. I made the API call and checked the ['tasks'] key, but the only somewhat useful fields I can see are 'result_state': TERMINATED and state_message = 'Workload failed, see run output for details'. Is it possible to have some custom parameter there which could indicate what error happened? I thought about a tag, but is it possible to create a tag or any other variable while running the notebook task and then read it from the runs/get API call?
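
For reference, this is roughly the call I am making (a minimal sketch; the workspace URL, token, and run ID below are placeholders):

import requests

HOST = "https://<workspace-url>"           # placeholder workspace URL
TOKEN = "<personal-access-token>"          # placeholder PAT
RUN_ID = 123456789                         # placeholder job run_id

resp = requests.get(
    f"{HOST}/api/2.1/jobs/runs/get",
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"run_id": RUN_ID},
)
resp.raise_for_status()

for task in resp.json().get("tasks", []):
    state = task.get("state", {})
    # Only the generic message shows up here, nothing about the actual cause
    print(task.get("task_key"), state.get("result_state"), state.get("state_message"))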


3 REPLIES

Walter_C
Databricks Employee

Custom error messages in the Workflows > Jobs UI or via the /api/2.1/jobs/runs/get API call are not directly supported. The error messages you're seeing, such as "RunExecutionError", are standard error codes that Databricks uses to indicate the type of error that occurred.

 

CaptainJack
New Contributor III

Any idea for a workaround? I thought about a tag value / job value holding the error, but I am not sure whether it is possible to get these using the jobs/runs/get API. If it is really not possible, I am thinking about email notifications as a last resort.
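
For context, by job value I mean setting something like this from inside the notebook task (just a sketch with a made-up key name; I don't know whether it can be read back through runs/get):

# Hypothetical key/value set from the failing notebook task
dbutils.jobs.taskValues.set(key="error_code", value="FILE_MISSING")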

Edthehead
Contributor II
Accepted Solution

What you can do is pass the custom error message you want from the notebook back to the workflow:

output = f"There was an error with {error_code} : {error_msg}"
dbutils.notebook.exit(output)
 
Then, when you are fetching the status of your pipeline, you can also check the output of each task. You need to loop through each task run_id to get the individual outputs, calling jobs/runs/get-output with run_id as a parameter. The run_id should be that of the task, not the job itself. Based on the output of the task, your pipeline can understand the custom error. You will find the output in a JSON field called notebook_output.
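
Putting the two halves together, a rough sketch of the polling side might look like this (workspace URL, token, and run ID are placeholders):

import requests

HOST = "https://<workspace-url>"           # placeholder workspace URL
TOKEN = "<personal-access-token>"          # placeholder PAT
JOB_RUN_ID = 123456789                     # run_id of the job run (the parent run)

headers = {"Authorization": f"Bearer {TOKEN}"}

# Get the job run to find the run_id of each individual task
run = requests.get(
    f"{HOST}/api/2.1/jobs/runs/get",
    headers=headers,
    params={"run_id": JOB_RUN_ID},
).json()

# Fetch each task's output; for notebook tasks it is returned under notebook_output
for task in run.get("tasks", []):
    out = requests.get(
        f"{HOST}/api/2.1/jobs/runs/get-output",
        headers=headers,
        params={"run_id": task["run_id"]},
    ).json()
    result = out.get("notebook_output", {}).get("result")  # whatever dbutils.notebook.exit() returned
    print(task.get("task_key"), result)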
