Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

"Fatal error: The Python kernel is unresponsive." DBR 14.3

ac0
Contributor

Running almost any notebook with a merge statement in Databricks on DBR 14.3, I get the following error and the notebook exits:

"Fatal error: The Python kernel is unresponsive."

I would provide more code, but like I said, it is pretty much anything with a merge statement, except for the tiniest of batches. Anything larger than that fails. Changing the compute to DBR 13.3 resolves the issue. Has anyone else encountered anything similar?
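To give a sense of the shape of what fails (table and column names below are placeholders, not my actual job), even a plain Delta merge like this hits the error once the batch is anything beyond tiny:

```python
# Hypothetical illustration only -- placeholder table/column names.
# `spark` is the SparkSession predefined in a Databricks notebook.
from delta.tables import DeltaTable

updates_df = spark.read.table("source_updates")        # incoming batch
target = DeltaTable.forName(spark, "target_table")     # existing Delta table

(target.alias("t")
    .merge(updates_df.alias("s"), "t.id = s.id")       # join on the key column
    .whenMatchedUpdateAll()                             # update existing rows
    .whenNotMatchedInsertAll()                          # insert new rows
    .execute())
```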

3 REPLIES

Ayushi_Suthar
Databricks Employee

Hi @ac0, hope you are doing well!

The "Fatal error: The Python kernel is unresponsive" error generally means the Jupyter Kernel is still running but is not responsive within a specific time frame.

Please try increasing the timeout by setting the config "spark.databricks.driver.python.pythonHealthCheckTimeoutSec" to 300 (5 minutes; the default is 1.5 minutes), and let us know if you still see the error after applying the configuration.
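For reference, one way to apply it (a sketch only; this health check belongs to the driver, so the reliable place is the cluster-level Spark config set before the driver starts, and it is not certain that setting it from a running notebook takes effect):

```python
# Sketch only -- the usual place for this setting is the cluster's Spark config
# (Compute -> your cluster -> Advanced options -> Spark -> Spark config):
#
#   spark.databricks.driver.python.pythonHealthCheckTimeoutSec 300
#
# If the setting is also honored at session level (unverified), the equivalent
# call from a notebook cell would be:
spark.conf.set("spark.databricks.driver.python.pythonHealthCheckTimeoutSec", "300")
```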

We also have a KB article on this issue that explains the cause and a possible solution to avoid it in the future:
https://kb.databricks.com/en_US/clusters/python-kernel-is-unresponsive-error-message

Please let me know if this helps, and leave a like if this information is useful. Follow-ups are appreciated.
Kudos,
Ayushi

Thanks @Ayushi_Suthar. I will try this and report back. Is "spark.databricks.driver.python.pythonHealthCheckTimeoutSec" a new configuration setting? Literally nothing comes up on Google or Stack Overflow when I search for either the full name or just "pythonHealthCheckTimeoutSec." I want to make sure I have a good understanding of it, so is there any documentation or further information you could point me to?

markthepaz
New Contributor

Same thing here; I'm not finding any documentation out there on "spark.databricks.driver.python.pythonHealthCheckTimeoutSec". @ac0 or @Ayushi_Suthar, any more details you found on this?
