Is it possible to have two notebooks in a DLT pipeline, where the first notebook reads from topic1 in Kafka and writes to a DLT table, and the second notebook reads from that DLT table, applies some transformations, and streams the results out to topic2 in Kafka?
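For context, here is a minimal sketch of the topology I have in mind, assuming the Databricks `dlt` Python API and Spark Structured Streaming (the broker address, topic names, table names, and checkpoint path are placeholders, and `spark` is the session Databricks injects into notebooks). Since `@dlt.table` always materializes a Delta table, the final write back to Kafka is shown as a plain Structured Streaming job rather than a DLT dataset:

```python
import dlt

# --- Notebook 1: ingest Kafka topic1 into a DLT table ---
@dlt.table(name="topic1_raw", comment="Raw events from Kafka topic1")
def topic1_raw():
    return (
        spark.readStream
        .format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")  # placeholder
        .option("subscribe", "topic1")
        .load()
    )

# --- Notebook 2 (same pipeline): transform the DLT table ---
@dlt.table(name="topic1_transformed", comment="Transformed events")
def topic1_transformed():
    return (
        dlt.read_stream("topic1_raw")
        # Example transformation: Kafka payloads arrive as binary,
        # so cast to string and uppercase the value.
        .selectExpr(
            "CAST(key AS STRING) AS key",
            "UPPER(CAST(value AS STRING)) AS value",
        )
    )

# --- Separate Structured Streaming job: publish to Kafka topic2 ---
(
    spark.readStream.table("topic1_transformed")
    .selectExpr("key", "value")  # the Kafka sink expects key/value columns
    .writeStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")        # placeholder
    .option("topic", "topic2")
    .option("checkpointLocation", "/tmp/checkpoints/topic2")  # placeholder
    .start()
)
```

This is only a sketch of the intended flow, not a confirmed working setup, which is exactly what I'm asking about.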