<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Re: Spark Streaming foreachBatch with Databricks connect in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/spark-streaming-foreachbatch-with-databricks-connect/m-p/95810#M39178</link>
    <description>&lt;P&gt;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/105910"&gt;@olivier-soucy&lt;/a&gt;&amp;nbsp;&lt;BR /&gt;Yup, I've used the same code. My setup is also similar: I'm running on a Mac too, but an M1.&lt;BR /&gt;&lt;BR /&gt;It would be best to monitor the driver logs while running the query, as most of the errors are logged there.&lt;/P&gt;</description>
    <pubDate>Thu, 24 Oct 2024 06:10:21 GMT</pubDate>
    <dc:creator>daniel_sahal</dc:creator>
    <dc:date>2024-10-24T06:10:21Z</dc:date>
    <item>
      <title>Spark Streaming foreachBatch with Databricks connect</title>
      <link>https://community.databricks.com/t5/data-engineering/spark-streaming-foreachbatch-with-databricks-connect/m-p/95565#M39123</link>
      <description>&lt;P&gt;I'm trying to use the foreachBatch method of a Spark Streaming DataFrame with databricks-connect. Given that Spark Connect support was added to `foreachBatch` in Spark 3.5.0, I was expecting this to work.&lt;/P&gt;&lt;P&gt;Configuration:&lt;BR /&gt;- DBR 15.4 (Spark 3.5.0)&lt;BR /&gt;- databricks-connect 15.4.2&lt;/P&gt;&lt;P&gt;Code:&lt;/P&gt;&lt;LI-CODE lang="python"&gt;import os
from databricks.connect import DatabricksSession

# Setup
spark = DatabricksSession.builder.clusterId("0501-011833-vcux5w7j").getOrCreate()

# Execute
df = spark.readStream.table("brz_stock_prices_job")

def update_metrics(batch_df, batch_id):
    size = batch_df.count()
    print(f"Batch size: {size}")

writer = df.writeStream.foreachBatch(update_metrics).start()&lt;/LI-CODE&gt;&lt;P&gt;Error:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;  File "/Users/osoucy/miniconda3/envs/lac/lib/python3.10/site-packages/pyspark/sql/connect/client/core.py", line 2149, in _handle_rpc_error
    raise convert_exception(
pyspark.errors.exceptions.connect.SparkConnectGrpcException: (java.io.IOException) 

Connection reset by peer
JVM stacktrace:
java.io.IOException
	at sun.nio.ch.FileDispatcherImpl.read0(FileDispatcherImpl.java:-2)
	at sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:39)
	at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:223)
	at sun.nio.ch.IOUtil.read(IOUtil.java:197)
	at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:379)
	at sun.nio.ch.SocketAdaptor$SocketInputStream.read(SocketAdaptor.java:208)
	at sun.nio.ch.ChannelInputStream.read(ChannelInputStream.java:103)
	at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
	at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
	at java.io.DataInputStream.readInt(DataInputStream.java:387)
	at org.apache.spark.api.python.StreamingPythonRunner.init(StreamingPythonRunner.scala:206)
	at org.apache.spark.sql.connect.planner.StreamingForeachBatchHelper$.$anonfun$pythonForeachBatchWrapper$3(StreamingForeachBatchHelper.scala:146)
        [...]
	at org.apache.spark.sql.connect.execution.ExecuteThreadRunner$ExecutionThread.run(ExecuteThreadRunner.scala:561)&lt;/LI-CODE&gt;&lt;P&gt;&lt;BR /&gt;Any help would be appreciated!&lt;/P&gt;</description>
      <pubDate>Tue, 22 Oct 2024 16:53:46 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/spark-streaming-foreachbatch-with-databricks-connect/m-p/95565#M39123</guid>
      <dc:creator>olivier-soucy</dc:creator>
      <dc:date>2024-10-22T16:53:46Z</dc:date>
    </item>
    <item>
      <title>Re: Spark Streaming foreachBatch with Databricks connect</title>
      <link>https://community.databricks.com/t5/data-engineering/spark-streaming-foreachbatch-with-databricks-connect/m-p/95692#M39151</link>
      <description>&lt;P&gt;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/105910"&gt;@olivier-soucy&lt;/a&gt;&amp;nbsp;&lt;BR /&gt;Are you sure that you're using DBR 15.4 and databricks-connect 15.4.2?&lt;BR /&gt;I've seen this issue when using databricks-connect 15.4.x with DBR 14.3 LTS.&lt;BR /&gt;&lt;BR /&gt;Anyway, I've just tested it with the same versions you've provided, and it works on my end.&lt;/P&gt;</description>
      <pubDate>Wed, 23 Oct 2024 11:07:57 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/spark-streaming-foreachbatch-with-databricks-connect/m-p/95692#M39151</guid>
      <dc:creator>daniel_sahal</dc:creator>
      <dc:date>2024-10-23T11:07:57Z</dc:date>
    </item>
    <item>
      <title>Re: Spark Streaming foreachBatch with Databricks connect</title>
      <link>https://community.databricks.com/t5/data-engineering/spark-streaming-foreachbatch-with-databricks-connect/m-p/95737#M39165</link>
      <description>&lt;P&gt;Hi Daniel!&lt;BR /&gt;&lt;BR /&gt;Thanks for getting back to me!&lt;BR /&gt;&lt;BR /&gt;Yes, it's DBR 15.4. It's in single user access mode, if that makes a difference. Here is a screenshot of the configuration:&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="oliviersoucy_0-1729692305990.png" style="width: 400px;"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/12226iC1323C5649543FB5/image-size/medium?v=v2&amp;amp;px=400" role="button" title="oliviersoucy_0-1729692305990.png" alt="oliviersoucy_0-1729692305990.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;And here are the exact versions of the Python packages I'm using:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;pyspark version:  3.5.0
grpcio version:  1.67.0
DB connect version:  15.4.2
pyspark version (from db connect):  3.5.0&lt;/LI-CODE&gt;&lt;P&gt;I'm on an M2 Mac. Could that make a difference? Have you used exactly the same code as I provided to run your tests?&lt;/P&gt;</description>
      <pubDate>Wed, 23 Oct 2024 14:38:39 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/spark-streaming-foreachbatch-with-databricks-connect/m-p/95737#M39165</guid>
      <dc:creator>olivier-soucy</dc:creator>
      <dc:date>2024-10-23T14:38:39Z</dc:date>
    </item>
    <item>
      <title>Re: Spark Streaming foreachBatch with Databricks connect</title>
      <link>https://community.databricks.com/t5/data-engineering/spark-streaming-foreachbatch-with-databricks-connect/m-p/95810#M39178</link>
      <description>&lt;P&gt;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/105910"&gt;@olivier-soucy&lt;/a&gt;&amp;nbsp;&lt;BR /&gt;Yup, I've used the same code. My setup is also similar: I'm running on a Mac too, but an M1.&lt;BR /&gt;&lt;BR /&gt;It would be best to monitor the driver logs while running the query, as most of the errors are logged there.&lt;/P&gt;</description>
      <pubDate>Thu, 24 Oct 2024 06:10:21 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/spark-streaming-foreachbatch-with-databricks-connect/m-p/95810#M39178</guid>
      <dc:creator>daniel_sahal</dc:creator>
      <dc:date>2024-10-24T06:10:21Z</dc:date>
    </item>
    <item>
      <title>Re: Spark Streaming foreachBatch with Databricks connect</title>
      <link>https://community.databricks.com/t5/data-engineering/spark-streaming-foreachbatch-with-databricks-connect/m-p/97143#M39433</link>
      <description>&lt;P&gt;Good call! I should have thought of doing that myself. It turned out that my local machine had a different Python version than the cluster workers. Updating Python solved the issue. Thank you so much for the help!&lt;/P&gt;</description>
      <pubDate>Fri, 01 Nov 2024 02:50:57 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/spark-streaming-foreachbatch-with-databricks-connect/m-p/97143#M39433</guid>
      <dc:creator>olivier-soucy</dc:creator>
      <dc:date>2024-11-01T02:50:57Z</dc:date>
    </item>
  </channel>
</rss>

