"Fatal error: The Python kernel is unresponsive." DBR 14.3
03-05-2024 04:00 PM
Running almost any notebook containing a MERGE statement in Databricks on DBR 14.3, I get the following error and the notebook exits:
"Fatal error: The Python kernel is unresponsive."
I would provide more code, but as I said, it is pretty much anything with a MERGE statement, except for the tiniest of batches; anything larger than that fails. Switching the compute to DBR 13.3 resolves the issue. Has anyone else encountered anything similar?
03-05-2024 09:17 PM
Hi @ac0, hope you are doing well!
The "Fatal error: The Python kernel is unresponsive" error generally means the Jupyter kernel is still running but has not responded within a specific time frame.
Please try increasing the timeout by setting the config "spark.databricks.driver.python.pythonHealthCheckTimeoutSec" to 300 (5 minutes; the default is 1.5 minutes), and let us know if you still see the error after applying the configuration.
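As a rough sketch of where this would go: since it is a driver-level property, it presumably needs to be set in the cluster's Spark config (Compute > your cluster > Advanced options > Spark) before the cluster starts, rather than from a running notebook. Note the property itself is not publicly documented, so its exact semantics here are an assumption:

```
# Cluster Spark config (Advanced options > Spark > Spark config)
# Assumed: raises the Python kernel health-check timeout from 90s to 300s
spark.databricks.driver.python.pythonHealthCheckTimeoutSec 300
```

After editing the Spark config, restart the cluster for the setting to take effect.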
We also have a KB article around this issue explaining the cause and possible solutions:
https://kb.databricks.com/en_US/clusters/python-kernel-is-unresponsive-error-message
Please let me know if this helps, and leave a like if this information is useful. Follow-ups are appreciated.
Kudos
Ayushi
03-06-2024 07:28 AM
Thanks @Ayushi_Suthar. I will try this and report back. Is "spark.databricks.driver.python.pythonHealthCheckTimeoutSec" a new configuration setting? Searching Google or Stack Overflow returns literally nothing, whether I use the full name or just "pythonHealthCheckTimeoutSec." I want to make sure I understand it well, so is there any documentation or further information you have that I could review?
11-19-2024 11:49 AM
Same here: I'm not finding any documentation on "spark.databricks.driver.python.pythonHealthCheckTimeoutSec". @ac0 or @Ayushi_Suthar, any more details you found on this?

