
Dynamic Spark Structured Streaming: Handling Stream-Stream Joins with Changing Settings

tranbau
New Contributor

I want to create a simple application using Spark Structured Streaming to alert users (via email, SMS, etc.) when stock price data meets certain requirements.

I have a data stream: data_stream

However, I'm struggling to address the main issue: how users can modify the requirements (settings) whenever they wish.

I'm considering using another stream called settings_stream and then joining the two streams. However, I've realized that joining these streams would produce numerous unnecessary alerts: when a user changes their settings, the new requirements are evaluated against all of the old price data (not just the latest price), and the same thing happens in reverse when new price data arrives and is matched against all of the old settings.
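To make the join idea concrete, this is roughly what I have in mind. The topic names ("prices", "settings"), the JSON schemas, and the min_price rule column below are placeholders rather than my real setup:

from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("price-alerts-join").getOrCreate()

price_schema = StructType([
    StructField("user_id", StringType()),
    StructField("price", DoubleType()),
    StructField("event_time", TimestampType()),
])
settings_schema = StructType([
    StructField("user_id", StringType()),
    StructField("min_price", DoubleType()),
    StructField("event_time", TimestampType()),
])

# Stock prices from Kafka.
data_stream = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "prices")
    .load()
    .select(F.from_json(F.col("value").cast("string"), price_schema).alias("p"))
    .select("p.*")
    .withWatermark("event_time", "10 minutes")
)

# User alert settings from Kafka.
settings_stream = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "settings")
    .load()
    .select(F.from_json(F.col("value").cast("string"), settings_schema).alias("s"))
    .select("s.*")
    .withWatermark("event_time", "10 minutes")
)

# Stream-stream inner join: every buffered price row for a user is matched
# against every buffered settings row for that user, which is exactly what
# produces the extra alerts described above.
alerts = data_stream.alias("p").join(
    settings_stream.alias("s"),
    F.expr("p.user_id = s.user_id AND p.price >= s.min_price"),
    "inner",
)

query = alerts.writeStream.outputMode("append").format("console").start()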

For example:

Price | Timestamp
1 | 00:00
2 | 00:01
3 | 00:05

With new settings: price >= 1 => 3 emails

Or:

PriceRule | Timestamp
=1 | 00:01
=2 | 00:05

With the latest price: price = 1 => 2 emails

How can I handle this situation? Or could you suggest some solutions for this use case other than a stream-stream join?

P.S.: My project must utilize Kafka and Spark Streaming/Structured Streaming.
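For reference, the behaviour I actually want is: only the latest price per user checked against only that user's latest rule. Below is a rough sketch of that intent, staying within Kafka and Structured Streaming but using foreachBatch instead of a stream-stream join. Again, the topic names, schemas, and min_price rule column are placeholders, and I'm not sure this is the right design:

from pyspark.sql import SparkSession, DataFrame
from pyspark.sql import functions as F
from pyspark.sql.window import Window
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("price-alerts-latest-only").getOrCreate()

price_schema = StructType([
    StructField("user_id", StringType()),
    StructField("price", DoubleType()),
    StructField("event_time", TimestampType()),
])
settings_schema = StructType([
    StructField("user_id", StringType()),
    StructField("min_price", DoubleType()),
    StructField("event_time", TimestampType()),
])

def send_alert(row):
    # Placeholder for the real email/SMS integration.
    print(f"ALERT user={row['user_id']} price={row['price']} rule=min_price {row['min_price']}")

def process_batch(prices: DataFrame, batch_id: int):
    latest_by_user = Window.partitionBy("user_id").orderBy(F.col("event_time").desc())

    # 1. Keep only the newest price per user within this micro-batch.
    latest_prices = (
        prices.withColumn("rn", F.row_number().over(latest_by_user))
        .filter("rn = 1")
        .drop("rn")
    )

    # 2. Batch-read the settings topic and keep only the newest rule per user,
    #    so old rules never fire again. A compacted Kafka topic or a small
    #    Delta table refreshed each micro-batch would work the same way.
    settings = (
        spark.read.format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")
        .option("subscribe", "settings")
        .option("startingOffsets", "earliest")
        .load()
        .select(F.from_json(F.col("value").cast("string"), settings_schema).alias("s"))
        .select("s.*")
    )
    latest_settings = (
        settings.withColumn("rn", F.row_number().over(latest_by_user))
        .filter("rn = 1")
        .select("user_id", "min_price")
    )

    # 3. Only the latest price is checked against the latest rule.
    matched = latest_prices.join(latest_settings, "user_id").filter("price >= min_price")
    for row in matched.collect():
        send_alert(row)

prices = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "prices")
    .load()
    .select(F.from_json(F.col("value").cast("string"), price_schema).alias("p"))
    .select("p.*")
)

query = prices.writeStream.foreachBatch(process_batch).start()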

Thank you all so much!
