Hi @AndriusVitkausk,
This is expected behaviour on Databricks. Code inside foreachBatch runs in the background as part of the long-lived streaming query, so print() and standard logging go to the driver logs / Spark UI, not back to the notebook output. In Databricks Runtime 14.0+, this is explicitly documented: "print() commands write output to the driver logs" for foreachBatch on compute with standard access mode.
There isn't a switch to redirect those logs into the notebook, including on serverless. If you need visibility outside the driver logs, Databricks recommends structured approaches instead: write metrics or diagnostics to a Delta/Unity Catalog table from inside foreachBatch, or use StreamingQueryListener together with observable metrics to push per-microbatch metrics to an external monitoring system.
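To illustrate the first approach, here is a minimal sketch of a foreachBatch sink that appends per-microbatch diagnostics to a Delta table instead of printing. The table name `diag.stream_metrics` and the helper names are illustrative assumptions, not part of any Databricks API:

```python
import time

def build_metrics_row(batch_id, row_count):
    """Build a plain-dict diagnostics record for one micro-batch."""
    return {"batch_id": batch_id, "row_count": row_count, "ts": time.time()}

def log_batch(batch_df, batch_id):
    """foreachBatch sink: process the micro-batch, then persist diagnostics.

    print() here would only reach the driver logs, so we append a row to a
    Delta table the notebook can query instead.
    """
    row = build_metrics_row(batch_id, batch_df.count())
    spark = batch_df.sparkSession  # DataFrame.sparkSession (Spark 3.3+)
    (spark.createDataFrame([row])
          .write.format("delta").mode("append")
          .saveAsTable("diag.stream_metrics"))  # hypothetical table name

# Usage inside a streaming query (df is an assumed streaming DataFrame):
# query = (df.writeStream
#            .foreachBatch(log_batch)
#            .option("checkpointLocation", "/tmp/ckpt")
#            .start())
```

You can then `SELECT * FROM diag.stream_metrics ORDER BY ts DESC` in a notebook cell to watch the stream's progress.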
If this answer resolves your question, could you mark it as "Accept as Solution"? That helps other users quickly find the correct fix.
Regards,
Ashwin | Delivery Solution Architect @ Databricks
Helping you build and scale the Data Intelligence Platform.
***Opinions are my own***