Warehousing & Analytics
Engage in discussions on data warehousing, analytics, and BI solutions within the Databricks Community. Share insights, tips, and best practices for leveraging data for informed decision-making.

Dynamic Spark Structured Streaming: Handling Stream-Stream Joins with Changing

tranbau
New Contributor

I want to create a simple application using Spark Structured Streaming to alert users (via email, SMS, etc.) when stock price data meets certain requirements.

I have a data stream: data_stream

However, I'm struggling to address the main issue: how users can modify the requirements (settings) whenever they wish.

I'm considering using another stream called settings_stream and then joining the two streams. But I've realized that joining these streams will produce numerous unnecessary alerts. For instance, when a user changes the settings, the new requirements are evaluated against all the old price data (not just the latest price), and similarly, when new price data arrives, it is evaluated against all the old settings.

For example:

Price | Timestamp
1 | 00:00
2 | 00:01
3 | 00:05

With new settings: price >= 1 => 3 emails

Or:

PriceRule | Timestamp
<=1 | 00:01
<=2 | 00:05

With the latest price: price = 1 => 2 emails
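To make the over-alerting concrete, here is a small pure-Python simulation (a sketch only; plain lists stand in for the Kafka-backed price and settings streams, and the rule operators are assumed from the examples above):

```python
# Simulate a naive stream-stream join: every buffered row on one side
# is matched against every newly arrived row on the other side.

def alerts_for_new_rule(buffered_prices, rule):
    """A newly arrived rule is evaluated against ALL buffered prices."""
    return [p for p in buffered_prices if rule(p)]

def alerts_for_new_price(price, buffered_rules):
    """A newly arrived price is evaluated against ALL buffered rules."""
    return [r for r in buffered_rules if r(price)]

# Scenario 1: prices 1, 2, 3 are buffered; a new rule "price >= 1" arrives.
prices = [1, 2, 3]
fired = alerts_for_new_rule(prices, lambda p: p >= 1)
print(len(fired))  # 3 emails, even though only the latest price matters

# Scenario 2: rules are buffered; the latest price 1 arrives.
rules = [lambda p: p <= 1, lambda p: p <= 2]
fired = alerts_for_new_price(1, rules)
print(len(fired))  # 2 emails, one per historical rule that matches
```

Each join side replays its full buffered history against the other, which is exactly the duplicate-alert behavior described above.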

How can I handle this situation? Or could you suggest some solutions for this use case other than a stream-stream join?
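For what it's worth, one common pattern (a sketch under my own assumptions, not something from this thread) is to treat settings as slowly changing state rather than a joinable stream: keep only the latest rule per user, and evaluate only the latest price against it, e.g. inside a foreachBatch sink or with mapGroupsWithState. The latest-wins logic itself is tiny:

```python
# Latest-wins evaluation: keep only the most recent rule and the most
# recent price, and alert on that single pair only.

def latest(events):
    """events: list of (timestamp, value) pairs; return the newest value.
    HH:MM strings compare correctly in lexicographic order."""
    return max(events, key=lambda e: e[0])[1]

price_events = [("00:00", 1), ("00:01", 2), ("00:05", 3)]
rule_events = [("00:01", lambda p: p <= 1), ("00:05", lambda p: p <= 2)]

latest_price = latest(price_events)  # 3
latest_rule = latest(rule_events)    # price <= 2

# At most one evaluation (and hence at most one alert) per change.
print(latest_rule(latest_price))
```

This avoids replaying history on either side, at the cost of maintaining per-user state instead of relying on the join buffer.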

P.S.: My project must utilize Kafka and Spark Streaming/Structured Streaming.

Thank you all so much!

0 REPLIES