IllegalArgumentException: requirement failed: Result for RPC Some(e100cace-3836-4461-8902-80b3744fcb6b) lost, please retry your request.

JohanRex
New Contributor II

I'm using Databricks Connect to talk to a cluster on Azure. When doing a count on a DataFrame I sometimes get this error message. Once I've hit it, I can't seem to get rid of it even if I restart my dev environment.

---------------------------------------------------------------------------
IllegalArgumentException                  Traceback (most recent call last)
c:\git\cap-mlv\notebooks\tmp_lookup_table.ipynb Cell 13' in <module>
----> 1 print(lookup_table_new.count())
 
File ~\AppData\Local\Continuum\anaconda3\envs\dbconnect\lib\site-packages\pyspark\sql\dataframe.py:670, in DataFrame.count(self)
    660 def count(self):
    661     """Returns the number of rows in this :class:`DataFrame`.
    662 
    663     .. versionadded:: 1.3.0
   (...)
    668     2
    669     """
--> 670     return int(self._jdf.count())
 
File ~\AppData\Local\Continuum\anaconda3\envs\dbconnect\lib\site-packages\py4j\java_gateway.py:1304, in JavaMember.__call__(self, *args)
   1298 command = proto.CALL_COMMAND_NAME +\
   1299     self.command_header +\
   1300     args_command +\
   1301     proto.END_COMMAND_PART
   1303 answer = self.gateway_client.send_command(command)
-> 1304 return_value = get_return_value(
   1305     answer, self.gateway_client, self.target_id, self.name)
   1307 for temp_arg in temp_args:
   1308     temp_arg._detach()
 
File ~\AppData\Local\Continuum\anaconda3\envs\dbconnect\lib\site-packages\pyspark\sql\utils.py:123, in capture_sql_exception.<locals>.deco(*a, **kw)
    119 converted = convert_exception(e.java_exception)
    120 if not isinstance(converted, UnknownException):
    121     # Hide where the exception came from that shows a non-Pythonic
    122     # JVM exception message.
--> 123     raise converted from None
    124 else:
    125     raise
 
IllegalArgumentException: requirement failed: Result for RPC Some(c4a9d516-8a33-4574-bf03-b3403c5d1b45) lost, please retry your request.

Using Databricks Runtime 9.1 LTS.

How can I narrow the problem down further?
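
One check I can already run locally is the client's built-in self-test, plus a version comparison against the cluster runtime (the databricks-connect package is required to match the DBR line, so 9.1.* here):

pip show databricks-connect    # client version should match the cluster runtime: 9.1.* for DBR 9.1 LTS
databricks-connect test        # built-in end-to-end connectivity self-test against the configured cluster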


4 REPLIES

Hubert-Dudek
Esteemed Contributor III

I think it is an issue with databricks-connect; it ran into some connection problem. Soon a Databricks tunnel will be available, so you will be able to run code from your IDE directly on Databricks (currently it runs on the Spark cluster).

JohanRex
New Contributor II

@Hubert Dudek​ Yes, it's likely related to the communication with the cluster. But it would be great if I could narrow it down further so I can find a way forward; it's pretty much blocking me right now.

I'm looking forward to the tunnel, but until then I need to get stuff done. 😕
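
As a stopgap I'm considering a simple retry wrapper, since the error message itself says "please retry your request". A rough sketch (IllegalArgumentException here is the pyspark.sql.utils type from the traceback above; lookup_table_new is my DataFrame):

import time
from pyspark.sql.utils import IllegalArgumentException

def count_with_retry(df, attempts=3, backoff_s=5):
    """Retry df.count() when the transient 'Result for RPC ... lost' error appears."""
    for attempt in range(1, attempts + 1):
        try:
            return df.count()
        except IllegalArgumentException as e:
            # Only retry the specific RPC-loss error; re-raise anything else,
            # and give up after the final attempt.
            if "lost, please retry" not in str(e) or attempt == attempts:
                raise
            time.sleep(backoff_s * attempt)  # back off a little longer each time

print(count_with_retry(lookup_table_new))

Given that the error seems to stick once it appears, this may not fix it, but it should at least tell me whether the failure is transient or permanent.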

ACCEPTED SOLUTION

Anonymous
Not applicable

Hi @Johan Rex​, we checked with the Databricks Connect team; this issue can happen when the library being uploaded is too large.

Databricks recommends that you use dbx by Databricks Labs for local development instead of Databricks Connect. Databricks plans no new feature development for Databricks Connect at this time. Also, be aware of the limitations of Databricks Connect.

https://docs.databricks.com/dev-tools/databricks-connect.html

https://docs.databricks.com/dev-tools/dbx.html
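
For reference, a minimal way to try dbx (a sketch only; dbx is published on PyPI, but exact command options vary by version, so check the docs linked above):

pip install dbx    # install the dbx CLI from PyPI
dbx init           # scaffold a new project from a template
# then deploy and run workflows with dbx deploy / dbx execute (see the dbx docs for exact flags)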

Kaniz
Community Manager

Hi @Johan Rex​, we haven't heard from you since my last response, and I was checking back to see if you have a resolution yet. If you found a solution, please share it with the community, as it can be helpful to others. Otherwise, we will respond with more details and try to help.

Also, please don't forget to click on the "Select As Best" button whenever the information provided helps resolve your question.
