I am wondering how to create complex streaming queries using Delta Live Tables (DLT). I can't find a way to use foreachBatch with it, which is causing me some difficulty: I need to apply a lag over a window without a time range, which is not supported directly on a streaming DataFrame but is straightforward inside foreachBatch, where each micro-batch can be processed with ordinary batch operations.
Is there a way to overcome this limitation?
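For context, this is roughly the pattern I use today in plain Structured Streaming outside of DLT. It is only a minimal sketch: the table names, column names, and checkpoint path are placeholders.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.getOrCreate()

def process_batch(batch_df, batch_id):
    # Non-time-based window with lag: rejected on a streaming DataFrame,
    # but allowed here because batch_df is a plain batch DataFrame.
    w = Window.partitionBy("device_id").orderBy("event_ts")
    enriched = batch_df.withColumn("prev_value", F.lag("value").over(w))
    enriched.write.format("delta").mode("append").saveAsTable("events_with_lag")

(
    spark.readStream
        .table("raw_events")  # placeholder streaming source
        .writeStream
        .foreachBatch(process_batch)
        .option("checkpointLocation", "/tmp/checkpoints/events_with_lag")
        .start()
)
```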
Additionally, if I create a materialized view in DLT to achieve the same result as the foreachBatch approach, will it automatically process only the newly arrived data (for example, by relying on a change data feed), or will it recompute the whole result on every update? And if it is refreshed incrementally, can a materialized view be considered roughly equivalent to foreachBatch in Spark Structured Streaming?
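To make the second part of the question concrete, here is a sketch of how I imagine expressing the same lag logic as a DLT materialized view. The dataset and column names are again placeholders, and I am assuming raw_events is defined elsewhere in the same pipeline.

```python
import dlt
from pyspark.sql import functions as F
from pyspark.sql.window import Window

@dlt.table(
    name="events_with_lag_mv",
    comment="Lag over the full history per key, with no time range",
)
def events_with_lag_mv():
    # dlt.read() is a batch read of another dataset in the pipeline,
    # so this defines a materialized view rather than a streaming table.
    w = Window.partitionBy("device_id").orderBy("event_ts")
    return dlt.read("raw_events").withColumn("prev_value", F.lag("value").over(w))
```

My question is whether an update of a view like this would reprocess only what changed in raw_events since the last run, or rebuild the entire result.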