Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Notebook returns succeeded even if there is a failure in any of the commands

moulitharan
New Contributor

I'm running a notebook with 14 commands from an ADF notebook activity. The last command writes the transformed data to a Delta table; I've wrapped it in a try/except block to handle errors, and I raise an error on exception.

When I run the notebook from ADF, the job run status shows Succeeded even if one of the commands fails. I need the notebook status to be reported as failed if any command fails. When a command does fail, the following commands are skipped, yet the run is still marked Succeeded. Sometimes even when I raise an exception in the try/except block, the notebook status is still displayed as Succeeded.

Is there a way to get the actual status of the notebook job run?

moulitharan_0-1718611669417.png
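For reference, the failure only propagates to ADF if the exception actually escapes the notebook; an except block that logs the error without re-raising leaves the run marked Succeeded. A minimal sketch of the catch-log-re-raise pattern (the Delta write is simulated here with a plain callable, since the real write step lives in the notebook):

```python
def write_with_reraise(write_fn):
    """Run a write step; log and re-raise any failure so the caller
    (e.g. an ADF-triggered notebook run) sees a failed status."""
    try:
        write_fn()
        return "Succeeded"
    except Exception as e:
        print(f"Write failed: {e}")
        raise  # re-raising is what marks the notebook run as failed


# Simulated failing write (in the notebook this would be the Delta table write):
def bad_write():
    raise RuntimeError("schema mismatch")


try:
    write_with_reraise(bad_write)
    status = "Succeeded"
except RuntimeError:
    status = "Failed"
print(status)  # -> Failed
```

The key detail is the bare `raise`: it preserves the original traceback and lets the exception reach the notebook's top level, which is what ADF observes.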

 

1 REPLY

Kaniz_Fatma
Community Manager
Community Manager

Hi @moulitharan, when you run a Databricks notebook through the Databricks Notebook activity in Azure Data Factory (ADF), the job status can be reported as "succeeded" even if individual commands within the notebook fail. However, there are ways to handle this and get more accurate status information.

  1. Monitor Databricks Job Status:

    • After running the notebook, log in to your Azure Databricks workspace and open the job run associated with your notebook (under Workflows > Job Runs).
    • The run status is displayed as "pending", "running", or "terminated".
    • Click the run name to view further details, including any errors or exceptions encountered during execution.
    • This approach lets you validate the parameters passed in and the output of the Python notebook.
  2. Custom Logic in ADF:

    • To have the notebook status reported as a failure when any command fails, you can implement custom logic in your ADF pipeline.
    • One approach is to add an activity (e.g., an If Condition or a validation step) after the notebook activity.
    • In that activity, check the status of the notebook execution, for example with an expression like @equals(activity('NotebookActivity').output.executionDetails.status, 'Succeeded').
    • If the status is not 'Succeeded', raise an error or take other appropriate action to mark the run as failed.
    • This way, the overall pipeline status reflects the success or failure of the entire notebook run.
  3. Pipeline Run History:

    • In the ADF Monitor tab, review the pipeline run history; each activity run lists its status and error details, which can help diagnose runs that ADF reported as succeeded.
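To check the real terminal state programmatically rather than through the UI, the Databricks Jobs API (`GET /api/2.1/jobs/runs/get`) returns a `state` object whose `result_state` is `SUCCESS`, `FAILED`, `TIMEDOUT`, or `CANCELED` once the run terminates. A sketch, assuming you supply your own workspace host, token, and run ID (the helper names are illustrative, not part of any SDK):

```python
import json
import urllib.request


def parse_result_state(run_json: dict) -> str:
    """Return result_state for a finished run, else the life_cycle_state."""
    state = run_json.get("state", {})
    return state.get("result_state") or state.get("life_cycle_state", "UNKNOWN")


def get_run_state(host: str, token: str, run_id: int) -> str:
    """Fetch a run's state via the Jobs API.
    host, token, and run_id are placeholders you must supply."""
    req = urllib.request.Request(
        f"{host}/api/2.1/jobs/runs/get?run_id={run_id}",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return parse_result_state(json.load(resp))
```

A downstream pipeline step (for example an Azure Function or Web activity) could call this and fail the pipeline whenever the returned state is not SUCCESS.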

If you encounter any specific issues or need further assistance, feel free to ask! 😊

 
