02-10-2026 02:08 AM
Hi @Mey, I would suggest using the Lakeflow suite (Lakeflow Connect, Lakeflow Declarative Pipelines, Lakeflow Jobs) from Databricks to achieve this event-driven incremental workload. The example below uses SQL, but you can also use Python. `read_files` shows Databricks' natively supported Auto Loader for incremental streaming ingestion.

Example workflow:
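A minimal sketch of a declarative pipeline, assuming the source files land in a cloud storage path (the path, table names, and JSON format below are placeholders for your own setup):

```sql
-- Bronze: incrementally ingest new files with Auto Loader via read_files.
-- Only files not yet processed are picked up on each pipeline update.
CREATE OR REFRESH STREAMING TABLE bronze_events
AS SELECT *
FROM STREAM read_files(
  '/Volumes/my_catalog/my_schema/landing/',  -- hypothetical landing path
  format => 'json'
);

-- Silver: incrementally transform the ingested records.
CREATE OR REFRESH STREAMING TABLE silver_events
AS SELECT
  event_id,          -- hypothetical columns for illustration
  CAST(event_ts AS TIMESTAMP) AS event_ts,
  payload
FROM STREAM bronze_events
WHERE event_id IS NOT NULL;
```

You would then attach this pipeline to a Lakeflow Job with a file-arrival trigger on the landing path, so each new file drop kicks off an incremental update rather than a full reprocess.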