Hi community,
I have a question regarding an error I sometimes get when running a job:
#
# A fatal error has been detected by the Java Runtime Environment:
#
# SIGSEGV (0xb) at pc=0x00007fc941e74996, pid=940, tid=0x00007fc892dff640
#
# JRE version: OpenJDK Runtime Environment (Zulu 8.72.0.17-CA-linux64) (8.0_382-b05) (build 1.8.0_382-b05)
# Java VM: OpenJDK 64-Bit Server VM (25.382-b05 mixed mode linux-amd64 compressed oops)
# Problematic frame:
# J 33375 C2 org.apache.spark.unsafe.types.UTF8String.hashCode()I (18 bytes) @ 0x00007fc941e74996 [0x00007fc941e748a0+0xf6]
#
# Failed to write core dump. Core dumps have been disabled. To enable core dumping, try "ulimit -c unlimited" before starting Java again
#
# An error report file with more information is saved as:
# /databricks/driver/hs_err_pid940.log
#
# If you would like to submit a bug report, please visit:
# http://www.azul.com/support/
#
Why do I get this error, and why does the job run successfully when I restart it? Is there a way to remediate this issue?
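For context, the problematic frame is JIT-compiled (C2) code, so I'm wondering whether excluding that method from compilation would be a reasonable temporary workaround until the root cause is clear. A sketch of what I have in mind, assuming the crash happens on the driver and that the standard -XX:CompileCommand syntax applies (I haven't verified this on my cluster):

spark.driver.extraJavaOptions -XX:CompileCommand=exclude,org/apache/spark/unsafe/types/UTF8String.hashCode

Would something like that be a sensible mitigation, or is there a better approach?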