In the last run, the error message contained additional information:
#
# A fatal error has been detected by the Java Runtime Environment:
#
# SIGSEGV (0xb) at pc=0x00007f168e094210, pid=1002, tid=0x00007f15dd1ff640
#
# JRE version: OpenJDK Runtime Environment (Zulu 8.72.0.17-CA-linux64) (8.0_382-b05) (build 1.8.0_382-b05)
# Java VM: OpenJDK 64-Bit Server VM (25.382-b05 mixed mode linux-amd64 compressed oops)
# Problematic frame:
# J 33949 C2 org.apache.spark.unsafe.types.UTF8String.hashCode()I (18 bytes) @ 0x00007f168e094210 [0x00007f168e0940e0+0x130]
#
# Failed to write core dump. Core dumps have been disabled. To enable core dumping, try "ulimit -c unlimited" before starting Java again
#
# An error report file with more information is saved as:
# /databricks/driver/hs_err_pid1002.log
Compiled method (c2)  335210 34051       4       org.apache.spark.sql.catalyst.util.ArrayData::foreach (64 bytes)
 total in heap  [0x00007f168d6b4610,0x00007f168d6b52b0] = 3232
 relocation     [0x00007f168d6b4738,0x00007f168d6b47b8] = 128
 main code      [0x00007f168d6b47c0,0x00007f168d6b4ca0] = 1248
 stub code      [0x00007f168d6b4ca0,0x00007f168d6b4ce8] = 72
 oops           [0x00007f168d6b4ce8,0x00007f168d6b4d00] = 24
 metadata       [0x00007f168d6b4d00,0x00007f168d6b4dc0] = 192
 scopes data    [0x00007f168d6b4dc0,0x00007f168d6b50f8] = 824
 scopes pcs     [0x00007f168d6b50f8,0x00007f168d6b51d8] = 224
 dependencies   [0x00007f168d6b51d8,0x00007f168d6b51e8] = 16
 handler table  [0x00007f168d6b51e8,0x00007f168d6b5290] = 168
 nul chk table  [0x00007f168d6b5290,0x00007f168d6b52b0] = 32
#
# If you would like to submit a bug report, please visit:
# http://www.azul.com/support/
#
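The problematic frame is a JIT-compiled one ("J 33949 C2 ... UTF8String.hashCode"), so a common, if unconfirmed, mitigation for crashes in C2-compiled code is to exclude that one method from C2 compilation, leaving it to run interpreted or C1-compiled. The sketch below is an assumption about how the job is launched, not something taken from the report; on Databricks the option would go into the cluster's Spark config rather than a spark-submit line:

 # Enable core dumps before starting Java, as the report itself suggests:
 ulimit -c unlimited

 # Hypothetical launch (job name and remaining arguments are placeholders);
 # the flag keeps the crashing method out of the C2 compiler:
 spark-submit \
   --conf "spark.driver.extraJavaOptions=-XX:CompileCommand=exclude,org/apache/spark/unsafe/types/UTF8String.hashCode" \
   ...

-XX:CompileCommand=exclude is a standard HotSpot flag; whether it avoids this particular crash, rather than just masking a memory-corruption bug elsewhere, is an assumption. The full hs_err_pid1002.log referenced above would be the place to confirm the failing frame before trying it.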