11-29-2021 12:37 AM
Hi,
I would like to capture custom log exceptions (Python) from a notebook in an ADF pipeline, so that the pipeline succeeds or fails based on those exceptions.
Is there any mechanism to implement this? In my testing, the ADF pipeline is successful irrespective of the logged errors.
The notebook always returns SUCCESS to ADF's notebook activity, even when an exception is raised in the notebook. If the notebook raises an exception, the ADF pipeline containing that notebook activity should fail.
Thank you
11-29-2021 01:20 AM
Hi @Sailaja B, the notebook errors will be tracked in the driver log4j output. You can check the cluster's driver logs to get this information. Alternatively, you can configure cluster log delivery so that all the messages are logged to the DBFS or storage path that you provide.
Please refer to this document:
https://docs.databricks.com/clusters/configure.html#cluster-log-delivery-1
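For reference, a minimal sketch of the relevant part of a cluster spec for log delivery (the DBFS destination path is a placeholder):

# Fragment of a cluster JSON spec / Clusters API payload: driver and executor
# logs are delivered periodically to the destination you specify.
cluster_spec = {
    "cluster_log_conf": {
        "dbfs": {"destination": "dbfs:/cluster-logs"}  # placeholder path
    }
}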
11-29-2021 01:45 AM
Thank you for your response.
My question is not about storing or retrieving the log info.
My scenario is:
The notebook always returns SUCCESS to ADF's notebook activity, even when an exception is raised in the notebook. If the notebook raises an exception, the ADF pipeline containing that notebook activity should fail.
11-29-2021 02:08 AM
Hi @Sailaja B, the notebook/job fails only when there is a real failure. Some exceptions are informational and might not affect the running notebook. To understand better, please share the exception that you see in the notebook output.
11-29-2021 02:18 AM
Here is the sample code
import logging
logger = logging.getLogger(__name__)

# Mount only if the mount point does not already exist; otherwise just log an error
if not any(mount.mountPoint == "/test/" for mount in dbutils.fs.mounts()):
    dbutils.fs.mount(source = "***",
                     mount_point = "/test/",
                     extra_configs = configs)
else:
    logger.error("Directory is already mounted")
Note: The mount path already exists. If I run this notebook through the ADF pipeline, I expect the pipeline to fail, but it does not fail.
11-29-2021 10:42 AM
Hi @Sailaja B,
You will need to raise/throw the exception to stop your Spark execution. Try using a try/except block to handle your custom exceptions and re-raise them.
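For example, a minimal sketch adapted to the mount code above (the raised message is illustrative; "***" and configs are the same placeholders as in your snippet):

try:
    if any(mount.mountPoint == "/test/" for mount in dbutils.fs.mounts()):
        raise Exception("Directory is already mounted")
    dbutils.fs.mount(source = "***", mount_point = "/test/", extra_configs = configs)
except Exception as error:
    logger.error(repr(error))
    raise  # re-raising fails the notebook run, and therefore the ADF notebook activity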
11-30-2021 11:40 PM
Hi @Jose Gonzalez,
Thank you for your reply.
It is working as expected with a try/except block and assert False.
09-07-2022 01:36 AM
Hi @Sailaja B, is it working now? I mean, is the pipeline showing failure when the notebook failed? If yes, please share a sample snippet. (I am trying the same case; I am able to capture logs on the pipeline side using the output JSON, but I couldn't modify the pipeline status.)
11-29-2021 01:23 AM
In addition to the option mentioned above, there is also the possibility to analyze logs using Azure Log Analytics.
11-29-2021 02:32 AM
Also, when you catch the exception you can save it anywhere, even to a Databricks table, with something like:
try:
    (...)
except Exception as error:
    spark.sql(f"""INSERT INTO (...) VALUES ('{repr(error)}')""")
    dbutils.notebook.exit(str(jobId) + ' - ERROR!!! - ' + repr(error))
In my opinion, as @werners said, sending the logs to Azure Log Analytics for detailed analysis is a good choice, but I also like to use the method above and have a nice table in Databricks with the jobs that failed 🙂
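A slightly more concrete sketch of that pattern, assuming a hypothetical job_failures table with job_id and error columns (run_etl and jobId are placeholders for your own logic and job identifier):

try:
    run_etl()  # placeholder for the actual notebook logic
except Exception as error:
    # escape single quotes so the error text can be embedded safely in the SQL literal
    msg = repr(error).replace("'", "''")
    spark.sql(f"INSERT INTO job_failures VALUES ('{jobId}', '{msg}')")
    dbutils.notebook.exit(str(jobId) + ' - ERROR!!! - ' + repr(error))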
12-21-2021 10:50 PM
Hi SailajaB,
Try this out.
A notebook, once executed successfully, returns a long JSON-formatted output. We need to specify the appropriate nodes to fetch the output.
In the screenshot below we can see that when the notebook ran it returned empName & empCity as output.
To capture this, we need to:
@activity('YOUR NOTEBOOK ACTIVITY NAME').output.runOutput.an_object.name.value
Note: If you want to specify a custom return value, then you need to use:
dbutils.notebook.exit('VALUE YOU WANT TO RETURN')
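For example, a minimal sketch of returning a structured value from the notebook (field names follow the empName/empCity example above; the values are placeholders), which the ADF notebook activity can then read under its output.runOutput node:

import json

result = {"empName": "SOME NAME", "empCity": "SOME CITY"}  # placeholder values
dbutils.notebook.exit(json.dumps(result))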
Let me know how it goes.
Cheers
GS