Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Delay rows coming into DLT pipeline

Mathias
New Contributor II


Background and requirements: We are reading data from our factory and storing it in a DLT table called telemetry with the columns sensorid, timestamp and value. We need to take the rows where sensorid is "qrreader-x", join them with other data from that same table, and store the result elsewhere. The QR codes arrive with very low latency, much lower than some of the sensors they should be joined with, so we need a delay to wait for the other data to arrive before processing these rows.

Suggestion: Could we create a DLT pipeline that runs as a batch every 5 minutes and only reads rows whose timestamp is older than x minutes?

 

import dlt
from pyspark.sql.functions import expr

@dlt.table(
    comment="A table summarizing counts of the top baby names for New York for 2021."
)
def top_baby_names_2021():
    return (
        dlt.read("baby_names_prepared")
        .filter(expr("Year_Of_Birth == 2021"))
    )

 

Looking at the example code above, I assume the pipeline would consider all incoming rows as handled, even the ones that are filtered out. So is there a way to put the filtered rows back into the pipeline if they are still too fresh?
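
To make the idea concrete, this is roughly what I have in mind (just a sketch, not tested; the 5-minute threshold, the telemetry table and the column names come from our setup described above):

import dlt
from pyspark.sql.functions import col, current_timestamp, expr

# Sketch: in a triggered (batch) pipeline, only pick up QR rows that are at
# least 5 minutes old, so the slower sensor data has time to arrive first.
@dlt.table(
    comment="QR-reader rows that are at least 5 minutes old and ready to be joined."
)
def delayed_qr_rows():
    return (
        dlt.read("telemetry")
        .filter(col("sensorid") == "qrreader-x")
        .filter(col("timestamp") < current_timestamp() - expr("INTERVAL 5 MINUTES"))
    )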

1 REPLY

raphaelblg
Databricks Employee

Hi @Mathias

I'd say that watermarking might be a good solution for your use case. Please check Control late data threshold with multiple watermark policy in Structured Streaming. 

If you want to dig in further, there's also the Spark Structured Streaming Programming Guide - Handling Late Data and Watermarking.
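
To illustrate the watermarking idea, here is a rough sketch (not tested against your schema; the table name, column names and thresholds are placeholders): each side of a stream-stream join gets its own watermark, and the join condition bounds how far apart the timestamps may be, so Spark waits for late sensor rows up to the threshold before finalizing results.

from pyspark.sql.functions import col, expr

# Sketch only: read the telemetry table twice as a stream, apply a watermark
# to each side, and join QR rows with the other sensor rows in a time range.
# `spark` is the active SparkSession (available by default in Databricks notebooks).
qr = (
    spark.readStream.table("telemetry")
    .filter(col("sensorid") == "qrreader-x")
    .withWatermark("timestamp", "10 minutes")
)

sensors = (
    spark.readStream.table("telemetry")
    .filter(col("sensorid") != "qrreader-x")
    .withWatermark("timestamp", "10 minutes")
)

# Inner join constrained to a +/- 2 minute window around each QR timestamp.
joined = qr.alias("qr").join(
    sensors.alias("s"),
    expr(
        "s.timestamp BETWEEN qr.timestamp - INTERVAL 2 MINUTES "
        "AND qr.timestamp + INTERVAL 2 MINUTES"
    ),
)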

There are other ways to achieve what you're aiming for; I think it's more of a design decision.

 

Best regards,

Raphael Balogo
Sr. Technical Solutions Engineer
Databricks
