Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

New strange error on Runtime 12 and above: java.lang.AssertionError: assertion failed

mortenhaga
Contributor

Hi all

I'm struggling to figure out why this error message suddenly pops up after running a cell in a notebook.

The notebook runs a simple "INSERT INTO" command in SQL. When I run only the SELECT part, the cell completes without error. I also only get this error on Databricks Runtime 12.0 and 12.1; 11.3 LTS runs both the INSERT INTO and the SELECT without errors. The weird thing is that the error only appeared about a week after I switched from 11.3 to Runtime 12. For now I have reverted to 11.3 so that my workflows can run.
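For context, the failing statement is an INSERT INTO ... SELECT built from CTEs with ROW_NUMBER() window functions. A stripped-down sketch (simplified from my real script, which has more CTEs and joins) looks like this:

```python
# Stripped-down sketch of the failing pattern: an INSERT INTO ... SELECT
# whose source uses a CTE with a ROW_NUMBER() window function (simplified;
# the real query joins several dimension tables). Running this via
# spark.sql(sql) fails on DBR 12.0/12.1 but works on 11.3 LTS.
sql = """
INSERT INTO enriched.daily_periodic_snapshot_fact
WITH contract_dim AS (
  SELECT *,
         ROW_NUMBER() OVER (
           PARTITION BY SaleIDSource
           ORDER BY Registered DESC
         ) AS rownumber
  FROM enriched.contract_dim
  WHERE DATE(Registered) <= DATE('2023-01-26')
)
SELECT * FROM contract_dim WHERE rownumber = 1
"""
# In the notebook this is executed as: spark.sql(sql)
```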

Can someone help decipher what this error message means?

Thanks!

Py4JJavaError                             Traceback (most recent call last)
File <command-3863647901736418>:7
      5     display(df)
      6     return df
----> 7   _sqldf = ____databricks_percent_sql()
      8 finally:
      9   del ____databricks_percent_sql
 
File <command-3863647901736418>:4, in ____databricks_percent_sql()
      2 def ____databricks_percent_sql():
      3   import base64
----> 4   df = spark.sql(base64.standard_b64decode("aW5zZXJ0IGludG8gZW5yaWNoZWQuZGFpbHlfcGVyaW9kaWNfc25hcHNob3RfZmFjdCAoCiAgLypnZXQgYWxsIHRoZSBzYWxlaWRzIHRoYXQgZXhpc3RzIGluIGVucmljaGVkLnRyYW5zYWN0aW9uc19mYWN0Ki8gIAogIHdpdGggc2FsZWlkcyBhcyAoCiAgCnNlbGVjdCBkaXN0aW5jdCBTYWxlSURTb3VyY2UgZnJvbSBlbnJpY2hlZC50cmFuc2FjdGlvbnNfZmFjdAogIAopLApjb250cmFjdF9kaW0gYXMgKAogIAogCiAgc2VsZWN0CiAgICAqLAogICAgCiAgICBST1dfTlVNQkVSKCkgT1ZFUiAoCiAgICAgIFBBUlRJVElPTiBCWSBTYWxlSURTb3VyY2UKICAgICAgT1JERVIgQlkKICAgICAgICBSZWdpc3RlcmVkIGRlc2MKICAgICkgYXMgcm93bnVtYmVyCiAgZnJvbQogICAgZW5yaWNoZWQuY29udHJhY3RfZGltCiAgCiAgd2hlcmUKICAgIFNhbGVJRFNvdXJjZSBpbiAoCiAgICAgIHNlbGVjdAogICAgICAgIGRpc3RpbmN0IFNhbGVJRFNvdXJjZQogICAgICBmcm9tCiAgICAgICAgc2FsZWlkcwogICAgKQogIAogICAgYW5kIERBVEUoUmVnaXN0ZXJlZCkgPD0gREFURSgnMjAyMy0wMS0yNicpCiksCnN0dWRlbnRfZGltIGFzICgKICAKICBzZWxlY3QKICAgICosCiAgICAKICAgIFJPV19OVU1CRVIoKSBPVkVSICgKICAgICAgUEFSVElUSU9OIEJZIENvbnRhY3RJRFNvdXJjZQogICAgICBPUkRFUiBCWQogICAgICAgIFJlZ2lzdGVyZWQgZGVzYwogICAgKSBhcyByb3dudW1iZXIKICBmcm9tCiAgICBlbnJpY2hlZC5zdHVkZW50X2RpbQogIAogIHdoZXJlCiAgICBDb250YWN0SURTb3VyY2UgaW4gKAogICAgICBzZWxlY3QKICAgICAgICBkaXN0aW5jdCBDb250YWN0SURTb3VyY2UKICAgICAgZnJvbQogICAgICAgIGNvbnRyYWN0X2... ").decode())
      5   display(df)
      6   return df
 
File /databricks/spark/python/pyspark/instrumentation_utils.py:48, in _wrap_function.<locals>.wrapper(*args, **kwargs)
     46 start = time.perf_counter()
     47 try:
---> 48     res = func(*args, **kwargs)
     49     logger.log_success(
     50         module_name, class_name, function_name, time.perf_counter() - start, signature
     51     )
     52     return res
 
File /databricks/spark/python/pyspark/sql/session.py:1205, in SparkSession.sql(self, sqlQuery, **kwargs)
   1203     sqlQuery = formatter.format(sqlQuery, **kwargs)
   1204 try:
-> 1205     return DataFrame(self._jsparkSession.sql(sqlQuery), self)
   1206 finally:
   1207     if len(kwargs) > 0:
 
File /databricks/spark/python/lib/py4j-0.10.9.5-src.zip/py4j/java_gateway.py:1321, in JavaMember.__call__(self, *args)
   1315 command = proto.CALL_COMMAND_NAME +\
   1316     self.command_header +\
   1317     args_command +\
   1318     proto.END_COMMAND_PART
   1320 answer = self.gateway_client.send_command(command)
-> 1321 return_value = get_return_value(
   1322     answer, self.gateway_client, self.target_id, self.name)
   1324 for temp_arg in temp_args:
   1325     temp_arg._detach()
 
File /databricks/spark/python/pyspark/sql/utils.py:209, in capture_sql_exception.<locals>.deco(*a, **kw)
    207 def deco(*a: Any, **kw: Any) -> Any:
    208     try:
--> 209         return f(*a, **kw)
    210     except Py4JJavaError as e:
    211         converted = convert_exception(e.java_exception)
 
File /databricks/spark/python/lib/py4j-0.10.9.5-src.zip/py4j/protocol.py:326, in get_return_value(answer, gateway_client, target_id, name)
    324 value = OUTPUT_CONVERTER[type](answer[2:], gateway_client)
    325 if answer[1] == REFERENCE_TYPE:
--> 326     raise Py4JJavaError(
    327         "An error occurred while calling {0}{1}{2}.\n".
    328         format(target_id, ".", name), value)
    329 else:
    330     raise Py4JError(
    331         "An error occurred while calling {0}{1}{2}. Trace:\n{3}\n".
    332         format(target_id, ".", name, value))
 
Py4JJavaError: An error occurred while calling o337.sql.
: java.lang.AssertionError: assertion failed
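
A side note for anyone reading the traceback: the notebook's %sql wrapper passes the cell contents to spark.sql() as a base64-encoded string, which is why the query looks unreadable above. Decoding the start of the string recovers my INSERT statement:

```python
import base64

# The %sql notebook wrapper embeds the cell's SQL as base64 inside the
# spark.sql(...) call shown in the traceback. Decoding a prefix of that
# string (the full string is truncated above) recovers the statement:
prefix = "aW5zZXJ0IGludG8gZW5yaWNoZWQuZGFpbHlfcGVyaW9kaWNfc25hcHNob3RfZmFjdCAo"
decoded = base64.standard_b64decode(prefix).decode()
print(decoded)  # insert into enriched.daily_periodic_snapshot_fact (
```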


8 REPLIES

shan_chandra
Esteemed Contributor

@morten haga - could you please try setting the config below and retry?

spark.conf.set("spark.sql.constraintPropagation.enabled", "false")
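
The setting is session-scoped, so it has to run in the same notebook session, before the INSERT cell. A hypothetical notebook flow (in Databricks notebooks `spark` is predefined; the INSERT shown is a placeholder for your own statement):

```python
# Hypothetical notebook flow: disable constraint propagation for this
# session, then retry the failing statement. `spark` is the SparkSession
# that Databricks notebooks provide automatically.
spark.conf.set("spark.sql.constraintPropagation.enabled", "false")
spark.sql("INSERT INTO enriched.daily_periodic_snapshot_fact ...")  # your statement here
```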

Hi @Shanmugavel Chandrakasu

Thanks for reaching out! I pasted the config into a cell above and ran it, but I still get the same error. If it helps, I can paste the script here, so please let me know.

Manoj12421
Valued Contributor II

You can raise a ticket about the problem with Databricks support and mention the obstacles you are facing.

Hi! Yes, I would if we had a support package, but we don't, so I can't 😞

youssefmrini
Honored Contributor III

Hello, I recommend you open a ticket with support. It's very important.

Hi! Yes, I would if we had a support package, but we don't, so I can't 😞

entongshen__Dat
New Contributor III

Thanks for reporting! We have identified a defect with an early version of DBR 12 related to INSERT INTO .. SELECT when certain query patterns are involved. The defect has since been fixed. Please let us know if you have any additional questions.

Thanks for finally answering. Then I can safely upgrade and roll out again.
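
Since the reply doesn't name the patched DBR version, one way to roll out defensively is to gate the workflow on the runtime version. A sketch, assuming the DATABRICKS_RUNTIME_VERSION environment variable that Databricks sets on cluster nodes; the 12.2 threshold below is my assumption, not a confirmed fix version:

```python
import os

def runtime_at_least(required: str, current: str) -> bool:
    """Numeric comparison of dotted runtime versions, e.g. "12.1" vs "11.3"."""
    def parse(v: str):
        parts = []
        for p in v.split("."):
            if not p.isdigit():
                break  # stop at suffixes like "x-scala2.12"
            parts.append(int(p))
        return tuple(parts)
    return parse(current) >= parse(required)

# DATABRICKS_RUNTIME_VERSION looks like "12.1" or "11.3" on cluster nodes.
current = os.environ.get("DATABRICKS_RUNTIME_VERSION", "0.0")
if not runtime_at_least("12.2", current):  # hypothetical "fixed" threshold
    print(f"Runtime {current} may still carry the INSERT INTO defect; "
          "consider staying on 11.3 LTS.")
```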
