<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Error when using pyflink on databricks, An error occurred while trying to connect to the Java server in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/error-when-using-pyflink-on-databricks-an-error-occurred-while/m-p/28073#M19911</link>
    <description>&lt;P&gt;Honestly I have no idea, but it seems like a bad idea: Databricks is built on Spark.&lt;/P&gt;&lt;P&gt;If you want to use Flink, I would run a dedicated Flink cluster.&lt;/P&gt;&lt;P&gt;If you are OK with Spark, you do not need the pyflink library. Databricks has a Kafka connector:&lt;/P&gt;&lt;P&gt;&lt;A href="https://docs.databricks.com/structured-streaming/kafka.html" target="_blank"&gt;https://docs.databricks.com/structured-streaming/kafka.html&lt;/A&gt;&lt;/P&gt;</description>
    <pubDate>Tue, 11 Oct 2022 13:19:01 GMT</pubDate>
    <dc:creator>-werners-</dc:creator>
    <dc:date>2022-10-11T13:19:01Z</dc:date>
    <item>
      <title>Error when using pyflink on databricks, An error occurred while trying to connect to the Java server</title>
      <link>https://community.databricks.com/t5/data-engineering/error-when-using-pyflink-on-databricks-an-error-occurred-while/m-p/28070#M19908</link>
      <description>&lt;P&gt;Hi, right now I am trying to run a pyflink script that connects to a Kafka server. When I run the script, I get the error "An error occurred while trying to connect to the Java server 127.0.0.1:35529". Do I need to install an extra JDK for that? &lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;error message:&lt;/P&gt;&lt;P&gt;Py4JNetworkError: An error occurred while trying to connect to the Java server (127.0.0.1:35529)&lt;/P&gt;&lt;P&gt;---------------------------------------------------------------------------&lt;/P&gt;&lt;P&gt;IndexError                                Traceback (most recent call last)&lt;/P&gt;&lt;P&gt;/databricks/spark/python/lib/py4j-0.10.9.1-src.zip/py4j/java_gateway.py in _get_connection(self)&lt;/P&gt;&lt;P&gt;    976         try:&lt;/P&gt;&lt;P&gt;--&amp;gt; 977             connection = self.deque.pop()&lt;/P&gt;&lt;P&gt;    978         except IndexError:&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;IndexError: pop from an empty deque&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;During handling of the above exception, another exception occurred:&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;ConnectionRefusedError                    Traceback (most recent call last)&lt;/P&gt;&lt;P&gt;/databricks/spark/python/lib/py4j-0.10.9.1-src.zip/py4j/java_gateway.py in start(self)&lt;/P&gt;&lt;P&gt;   1114         try:&lt;/P&gt;&lt;P&gt;-&amp;gt; 1115             self.socket.connect((self.address, self.port))&lt;/P&gt;&lt;P&gt;   1116             self.stream = self.socket.makefile("rb")&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;ConnectionRefusedError: [Errno 111] Connection refused&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;During handling of the above exception, another exception occurred:&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Py4JNetworkError                          Traceback (most recent call last)&lt;/P&gt;&lt;P&gt;&amp;lt;command-1981093759856396&amp;gt; in &amp;lt;module&amp;gt;&lt;/P&gt;&lt;P&gt;     88 &lt;/P&gt;&lt;P&gt;     89 &lt;/P&gt;&lt;P&gt;---&amp;gt; 
90 main()&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;&amp;lt;command-1981093759856396&amp;gt; in main()&lt;/P&gt;&lt;P&gt;      5 def main():&lt;/P&gt;&lt;P&gt;      6     # Create streaming environment&lt;/P&gt;&lt;P&gt;----&amp;gt; 7     env = StreamExecutionEnvironment.get_execution_environment()&lt;/P&gt;&lt;P&gt;      8 &lt;/P&gt;&lt;P&gt;      9     settings = EnvironmentSettings.new_instance()\&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;/databricks/python/lib/python3.8/site-packages/pyflink/datastream/stream_execution_environment.py in get_execution_environment()&lt;/P&gt;&lt;P&gt;    801         """&lt;/P&gt;&lt;P&gt;    802         gateway = get_gateway()&lt;/P&gt;&lt;P&gt;--&amp;gt; 803         j_stream_exection_environment = gateway.jvm.org.apache.flink.streaming.api.environment\&lt;/P&gt;&lt;P&gt;    804             .StreamExecutionEnvironment.getExecutionEnvironment()&lt;/P&gt;&lt;P&gt;    805         return StreamExecutionEnvironment(j_stream_exection_environment)&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;/databricks/spark/python/lib/py4j-0.10.9.1-src.zip/py4j/java_gateway.py in __getattr__(self, name)&lt;/P&gt;&lt;P&gt;   1690             return UserHelpAutoCompletion()&lt;/P&gt;&lt;P&gt;   1691 &lt;/P&gt;&lt;P&gt;-&amp;gt; 1692         answer = self._gateway_client.send_command(&lt;/P&gt;&lt;P&gt;   1693             proto.REFLECTION_COMMAND_NAME +&lt;/P&gt;&lt;P&gt;   1694             proto.REFL_GET_UNKNOWN_SUB_COMMAND_NAME + name + "\n" + self._id +&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;/databricks/spark/python/lib/py4j-0.10.9.1-src.zip/py4j/java_gateway.py in send_command(self, command, retry, binary)&lt;/P&gt;&lt;P&gt;   1029          if `binary` is `True`.&lt;/P&gt;&lt;P&gt;   1030         """&lt;/P&gt;&lt;P&gt;-&amp;gt; 1031         connection = self._get_connection()&lt;/P&gt;&lt;P&gt;   1032         try: &lt;/P&gt;</description>
      <pubDate>Mon, 10 Oct 2022 16:10:44 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/error-when-using-pyflink-on-databricks-an-error-occurred-while/m-p/28070#M19908</guid>
      <dc:creator>witnessthee</dc:creator>
      <dc:date>2022-10-10T16:10:44Z</dc:date>
    </item>
    <item>
      <title>Re: Error when using pyflink on databricks, An error occurred while trying to connect to the Java server</title>
      <link>https://community.databricks.com/t5/data-engineering/error-when-using-pyflink-on-databricks-an-error-occurred-while/m-p/28071#M19909</link>
      <description>&lt;P&gt;Did you get Flink running on the Databricks cluster? That seems to be the issue here.&lt;/P&gt;</description>
      <pubDate>Tue, 11 Oct 2022 13:12:32 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/error-when-using-pyflink-on-databricks-an-error-occurred-while/m-p/28071#M19909</guid>
      <dc:creator>-werners-</dc:creator>
      <dc:date>2022-10-11T13:12:32Z</dc:date>
    </item>
    <item>
      <title>Re: Error when using pyflink on databricks, An error occurred while trying to connect to the Java server</title>
      <link>https://community.databricks.com/t5/data-engineering/error-when-using-pyflink-on-databricks-an-error-occurred-while/m-p/28072#M19910</link>
      <description>&lt;P&gt;Thanks for the reply. May I know how to run Flink on Databricks clusters?&lt;/P&gt;</description>
      <pubDate>Tue, 11 Oct 2022 13:16:49 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/error-when-using-pyflink-on-databricks-an-error-occurred-while/m-p/28072#M19910</guid>
      <dc:creator>witnessthee</dc:creator>
      <dc:date>2022-10-11T13:16:49Z</dc:date>
    </item>
    <item>
      <title>Re: Error when using pyflink on databricks, An error occurred while trying to connect to the Java server</title>
      <link>https://community.databricks.com/t5/data-engineering/error-when-using-pyflink-on-databricks-an-error-occurred-while/m-p/28073#M19911</link>
      <description>&lt;P&gt;Honestly I have no idea, but it seems like a bad idea: Databricks is built on Spark.&lt;/P&gt;&lt;P&gt;If you want to use Flink, I would run a dedicated Flink cluster.&lt;/P&gt;&lt;P&gt;If you are OK with Spark, you do not need the pyflink library. Databricks has a Kafka connector:&lt;/P&gt;&lt;P&gt;&lt;A href="https://docs.databricks.com/structured-streaming/kafka.html" target="_blank"&gt;https://docs.databricks.com/structured-streaming/kafka.html&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 11 Oct 2022 13:19:01 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/error-when-using-pyflink-on-databricks-an-error-occurred-while/m-p/28073#M19911</guid>
      <dc:creator>-werners-</dc:creator>
      <dc:date>2022-10-11T13:19:01Z</dc:date>
    </item>
  </channel>
</rss>

