<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Building Event-Driven Real-Time Data Processor with Spark Structured Streaming and API Integration in Community Articles</title>
    <link>https://community.databricks.com/t5/community-articles/building-event-driven-real-time-data-processor-with-spark/m-p/62791#M4</link>
    <description>&lt;P&gt;I recently saw a Databricks article titled "&lt;A href="https://www.databricks.com/blog/scalable-spark-structured-streaming-rest-api-destinations" target="_self"&gt;Scalable Spark Structured Streaming for REST API Destinations&lt;/A&gt;", a great article, about a year old, focusing on continuous Spark Structured Streaming (SSS). Given customer demand, I then decided to write "&lt;A href="https://www.linkedin.com/pulse/building-event-driven-real-time-data-processor-spark-mich-zy3ef/?trackingId=uha3Lx7o4FXXyeeDWxcvDA%3D%3D" target="_self"&gt;Building an Event-Driven Real-Time Data Processor with Spark Structured Streaming and API Integration&lt;/A&gt;".&lt;/P&gt;&lt;P&gt;In the fast-paced realm of data processing, the ability to derive actionable insights in real time is essential for organizations across many domains. My article constructs a robust, event-driven, real-time data processor that integrates APIs using Apache Spark, a REST API, and Flask. The focus is on enabling data engineers and developers to process streaming data efficiently while staying responsive to external events. The article takes a distinctive approach centred on handling simulated market data. In contrast to conventional scenarios such as the one in the Databricks article, the architecture comprises two key components: a well-established Bash script serving as a historical financial data generator for various tickers (IBM, MRW, MSFT, among others), and a Python application that transmits the data to a REST API. The diagram below shows the components.&lt;/P&gt;&lt;P&gt;The full article, including the accompanying GitHub code, is available from the LinkedIn link above.&lt;/P&gt;</description>
    <pubDate>Wed, 06 Mar 2024 21:09:33 GMT</pubDate>
    <dc:creator>MichTalebzadeh</dc:creator>
    <dc:date>2024-03-06T21:09:33Z</dc:date>
    <item>
      <title>Building Event-Driven Real-Time Data Processor with Spark Structured Streaming and API Integration</title>
      <link>https://community.databricks.com/t5/community-articles/building-event-driven-real-time-data-processor-with-spark/m-p/62791#M4</link>
      <description>&lt;P&gt;I recently saw a Databricks article titled "&lt;A href="https://www.databricks.com/blog/scalable-spark-structured-streaming-rest-api-destinations" target="_self"&gt;Scalable Spark Structured Streaming for REST API Destinations&lt;/A&gt;", a great article, about a year old, focusing on continuous Spark Structured Streaming (SSS). Given customer demand, I then decided to write "&lt;A href="https://www.linkedin.com/pulse/building-event-driven-real-time-data-processor-spark-mich-zy3ef/?trackingId=uha3Lx7o4FXXyeeDWxcvDA%3D%3D" target="_self"&gt;Building an Event-Driven Real-Time Data Processor with Spark Structured Streaming and API Integration&lt;/A&gt;".&lt;/P&gt;&lt;P&gt;In the fast-paced realm of data processing, the ability to derive actionable insights in real time is essential for organizations across many domains. My article constructs a robust, event-driven, real-time data processor that integrates APIs using Apache Spark, a REST API, and Flask. The focus is on enabling data engineers and developers to process streaming data efficiently while staying responsive to external events. The article takes a distinctive approach centred on handling simulated market data. In contrast to conventional scenarios such as the one in the Databricks article, the architecture comprises two key components: a well-established Bash script serving as a historical financial data generator for various tickers (IBM, MRW, MSFT, among others), and a Python application that transmits the data to a REST API. The diagram below shows the components.&lt;/P&gt;&lt;P&gt;The full article, including the accompanying GitHub code, is available from the LinkedIn link above.&lt;/P&gt;</description>
      <pubDate>Wed, 06 Mar 2024 21:09:33 GMT</pubDate>
      <guid>https://community.databricks.com/t5/community-articles/building-event-driven-real-time-data-processor-with-spark/m-p/62791#M4</guid>
      <dc:creator>MichTalebzadeh</dc:creator>
      <dc:date>2024-03-06T21:09:33Z</dc:date>
    </item>
    <item>
      <title>Re: Building Event-Driven Real-Time Data Processor with Spark Structured Streaming and API Integration</title>
      <link>https://community.databricks.com/t5/community-articles/building-event-driven-real-time-data-processor-with-spark/m-p/62844#M6</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/9"&gt;@Retired_mod&lt;/a&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thank you for your kind words. Of course, I would be delighted to contribute to your technical blogs. Let me know how.&lt;/P&gt;&lt;P&gt;Regards,&lt;/P&gt;&lt;P&gt;Mich&lt;/P&gt;</description>
      <pubDate>Thu, 07 Mar 2024 09:38:33 GMT</pubDate>
      <guid>https://community.databricks.com/t5/community-articles/building-event-driven-real-time-data-processor-with-spark/m-p/62844#M6</guid>
      <dc:creator>MichTalebzadeh</dc:creator>
      <dc:date>2024-03-07T09:38:33Z</dc:date>
    </item>
  </channel>
</rss>