I'm consuming multiple topics from Confluent Kafka and processing each record with business rules using Spark Structured Streaming (`.writeStream` with `.foreach()`). I call another notebook via `%run` and invoke its class through `foreach` while performing the write stream. The class has `open()`, `process()`, `close()`, and other methods that apply business rules for each topic and load the records into a database. One record from a topic has an error, and the job fails when calling the method that applies the business rules. Is there an easy way to step into the notebook at runtime (by adding breakpoints) and see what record is passed and where (which line) it is failing?
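For reference, this is a simplified sketch of my setup. The broker address, topic names, the `BusinessRuleWriter` class name, and the `apply_business_rules` helper are placeholders; the real logic lives in the notebook pulled in with `%run`:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("kafka-rules").getOrCreate()

class BusinessRuleWriter:
    def open(self, partition_id, epoch_id):
        # e.g. open a database connection per partition here
        return True

    def process(self, row):
        # business rules are applied per record; this is where it fails
        apply_business_rules(row)  # hypothetical helper defined in the %run notebook

    def close(self, error):
        # close the connection; 'error' holds the exception, if any
        pass

df = (spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")   # placeholder
      .option("subscribe", "topic_a,topic_b")             # placeholder topics
      .load())

query = (df.writeStream
         .foreach(BusinessRuleWriter())
         .start())
```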
I did a Google search, and the best option I found is to add a print statement and check stdout on the Spark worker node. I still couldn't locate my print statement in stdout. Is there a better approach to debugging this scenario?
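This is roughly what I tried, assuming `row` arrives as a PySpark `Row` (so `row.asDict()` should dump its fields):

```python
def process(self, row):
    # attempted debugging: print the record before applying the rules,
    # expecting to find the output in the worker node's stdout logs
    print(f"processing record: {row.asDict()}")
    apply_business_rules(row)  # hypothetical helper from the %run notebook
```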