Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Notebook returns Succeeded even if there is a failure in any of the commands

moulitharan
New Contributor

I'm running a notebook with 14 commands from an ADF Notebook activity. In the last command I write the transformed data to a Delta table; that command is wrapped in a try/except block to handle errors, and I raise an error on exception.

When I run the notebook from ADF, the job run status shows Succeeded even if one of the commands fails. I need the notebook status to be returned as Failed if any command fails. When a command does fail, the following commands are skipped, yet the run still shows Succeeded. Sometimes even when I raise an exception in the try/except block, the status of the notebook is still displayed as Succeeded.

Is there a way to get the actual status of the notebook job run?
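
A minimal sketch of the last-command pattern described above, assuming a Databricks notebook where `spark` and `dbutils` are predefined; the DataFrame and table name are placeholders, not the original notebook's objects:

```python
# Stand-ins for the transformed data and the target Delta table.
df = spark.range(10)                       # placeholder for the transformed DataFrame
target_table = "my_schema.my_delta_table"  # placeholder Delta table name

try:
    # Last command: write the transformed data to a Delta table.
    df.write.format("delta").mode("overwrite").saveAsTable(target_table)
except Exception as e:
    print(f"Delta write failed: {e}")
    # Re-raise so the notebook run itself ends as Failed. Swallowing the
    # exception, or calling dbutils.notebook.exit() here, ends the run
    # successfully and ADF will then report the activity as Succeeded.
    raise
```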

[Screenshot attachment: moulitharan_0-1718611669417.png]

1 REPLY

Kaniz_Fatma
Community Manager

Hi @moulitharan, when running a Databricks notebook through the Databricks Notebook activity in Azure Data Factory (ADF), the job status can be reported as “Succeeded” even if individual commands within the notebook fail. However, there are ways to handle this and get more accurate status information.

  1. Monitor Databricks Job Status: check the actual result of the run in the Databricks Jobs UI, or query it programmatically through the Jobs API (see the sketch after this list).

  2. Custom Logic in ADF:

    • To handle the scenario where you want the notebook status to be returned as “Failed” when any command fails, you can implement custom logic in your ADF pipeline.
    • One approach is to add an activity after the Notebook activity that inspects its result (for example, an If Condition or Fail activity).
    • In that activity, you can check the status of the notebook execution, e.g. with an expression like @equals(activity('NotebookActivity').output.executionDetails.status,'Succeeded').
    • If the status is not “Succeeded,” you can raise an error or take other appropriate actions to indicate failure.
    • This way, you can ensure that the overall pipeline status reflects the success or failure of the entire notebook run.
  3. Pipeline Run History: review the pipeline and activity run history in the ADF monitoring view to confirm how each Notebook activity run actually ended.
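
As a rough illustration of step 1, here is a minimal Python sketch that queries the Databricks Jobs API (/api/2.1/jobs/runs/get) for a run's real result. The workspace URL, token, and run_id are placeholders; in practice the run ID could be taken from the Notebook activity's output (for example, its run page URL).

```python
import requests

DATABRICKS_HOST = "https://<your-workspace>.azuredatabricks.net"  # placeholder
TOKEN = "<personal-access-token>"                                  # placeholder
run_id = 123456                                                    # placeholder

# Fetch the run's state from the Jobs API.
resp = requests.get(
    f"{DATABRICKS_HOST}/api/2.1/jobs/runs/get",
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"run_id": run_id},
    timeout=30,
)
resp.raise_for_status()
state = resp.json().get("state", {})

# result_state is only present once the run has terminated; anything other
# than "SUCCESS" should be treated as a failure.
if state.get("result_state") != "SUCCESS":
    raise RuntimeError(f"Notebook run {run_id} did not succeed: {state}")
```

Another option is to have the notebook return a status value with dbutils.notebook.exit(...) and inspect the Notebook activity's runOutput in the pipeline, keeping in mind that exiting this way ends the run as Succeeded.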

If you encounter any specific issues or need further assistance, feel free to ask! 😊
