Data Engineering
py4j.protocol.Py4JJavaError: An error occurred while calling o359.sql. : java.util.NoSuchElementException

deng_dev
New Contributor III

Hi!
We are creating a table in a streaming job on every micro-batch using a spark.sql('create or replace table ... using delta as ...') command. The query combines data from multiple tables.

Sometimes our job fails with error:

py4j.Py4JException: An exception was raised by the Python Proxy. Return Message: Traceback (most recent call last):
py4j.protocol.Py4JJavaError: An error occurred while calling o359.sql.
: java.util.NoSuchElementException: key not found: Filter (isnotnull(uuid#42326735) AND isnotnull(actor_uuid#42326740))

How can I stop getting this error, or add a try/except statement to handle it? I tried the following, but it doesn't seem to work and the job continues to fail:

from py4j.protocol import Py4JError, Py4JJavaError

try:
    spark.sql(query)  # original had spark.sql.(query), which is a syntax error
except Py4JJavaError as e:  # most specific first: Py4JJavaError subclasses Py4JError
    ## some code
except Py4JError as e:
    ## some code
except Exception as e:  # a bare Exception clause must come last, or it catches everything
    ## some code

Kaniz
Community Manager

Hi @deng_dev , The error message you’re encountering, java.util.NoSuchElementException: key not found: Filter (isnotnull(uuid#42326735) AND isnotnull(actor_uuid#42326740)), indicates that there’s an issue with the query execution.

 

Let’s address this step by step:

 

Check Your Query:

  • Ensure that the query you’re executing is syntactically correct and logically valid. Verify that the table names, column names, and expressions are accurate.
  • Specifically, look for any references to uuid#42326735 and actor_uuid#42326740 in your query. Make sure these columns exist and are correctly spelled.
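One way to act on this before the job fails: compare the columns each source table actually exposes against the ones the query needs. The helper below is a hypothetical sketch (the table and column names are placeholders, not from the original post); in a Databricks job the actual column lists would come from spark.table(name).columns.

```python
def find_missing_columns(required, actual):
    """Return {table: [missing columns]} for every table whose actual
    columns don't cover the required ones.

    required: {table_name: [columns the query depends on]}
    actual:   {table_name: [columns reported by the catalog]},
              e.g. {t: spark.table(t).columns for t in required}
    """
    missing = {}
    for table, cols in required.items():
        present = set(actual.get(table, []))
        gap = [c for c in cols if c not in present]
        if gap:
            missing[table] = gap
    return missing
```

Running a check like this just before the CREATE OR REPLACE TABLE statement turns a cryptic analyzer failure into an explicit error naming the missing columns.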

Data Availability:

  • The error suggests that the keys uuid#42326735 and actor_uuid#42326740 are not found. This could be due to missing data or incorrect column names.
  • Check if the required columns exist in the tables you’re combining. Ensure that the data is available for the specified micro-batch.
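Because this kind of analyzer error can be transient in a streaming job (the state it refers to may change between micro-batches), one pragmatic option is to retry the statement a few times before letting the batch fail. This is a generic sketch, not Databricks-specific; run_with_retry and its parameters are illustrative:

```python
import time

def run_with_retry(action, attempts=3, delay_seconds=0.0):
    """Call action(); if it raises, retry up to `attempts` times in total,
    sleeping `delay_seconds` between tries. Re-raises the last error."""
    last_error = None
    for _ in range(attempts):
        try:
            return action()
        except Exception as e:
            last_error = e
            time.sleep(delay_seconds)
    raise last_error
```

In the streaming job this would wrap the CTAS call, e.g. run_with_retry(lambda: spark.sql(query), attempts=3, delay_seconds=5).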

Try/Except Block:

  • A try/except block is a reasonable approach, but make sure the call is spelled spark.sql(query) and that the except Exception clause comes last: Py4JJavaError subclasses Py4JError, and both subclass Exception, so a bare except Exception listed first swallows every error and leaves the more specific handlers unreachable.
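To see why the ordering of the except clauses matters, here is a minimal sketch using stand-in classes that mirror py4j's hierarchy for illustration (in the real library, Py4JJavaError subclasses Py4JError, which subclasses Exception):

```python
# Stand-ins mirroring py4j's exception hierarchy, for illustration only.
class Py4JError(Exception):
    pass

class Py4JJavaError(Py4JError):
    pass

def classify(exc):
    """Raise exc and report which clause catches it when handlers are
    ordered most-specific first."""
    try:
        raise exc
    except Py4JJavaError:
        return "Py4JJavaError handler"
    except Py4JError:
        return "Py4JError handler"
    except Exception:
        return "generic Exception handler"
```

If except Exception were listed first instead, every error would land in it and the two py4j-specific handlers would never run.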

If you encounter specific issues, feel free to share more details, and I’ll assist further! 🚀
