Hi @ac0, hope you are doing well!
The "Fatal error: The Python kernel is unresponsive" error generally means the Jupyter Kernel is still running but is not responsive within a specific time frame.
Please try increasing the timeout by setting the config "spark.databricks.driver.python.pythonHealthCheckTimeoutSec" to 300 (5 minutes; the default is 90 seconds, i.e. 1.5 minutes), and let us know if you still see the error after applying the configuration.
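For reference, here is a minimal sketch of how this could be applied, assuming the setting is added as a cluster-level Spark config (driver health-check settings generally need to be in place before the driver starts, so setting this from a notebook at runtime would not take effect):

spark.databricks.driver.python.pythonHealthCheckTimeoutSec 300

Add this line under your cluster's Edit > Advanced options > Spark > Spark config (one key-value pair per line; the exact UI path may vary by workspace version), then restart the cluster for it to take effect.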
We also have a KB article on this issue that explains the cause and possible solutions to help resolve it in the future:
https://kb.databricks.com/en_US/clusters/python-kernel-is-unresponsive-error-message
Please let me know if this helps, and leave a like if this information is useful. Follow-ups are appreciated.
Kudos
Ayushi