Data Engineering
Forum Posts

Data_Analytics1
by Contributor III
  • 8066 Views
  • 17 replies
  • 24 kudos

Fatal error: The Python kernel is unresponsive.

I am using multithreading in this job, which creates 8 parallel jobs. It fails a few times a day and sometimes gets stuck in one of the Python notebook cell processes. Here: The Python process exited with an unknown exit code. The last 10 KB of the process's...

Latest Reply
luis_herrera
New Contributor III

Hey, it seems that the issue is related to the driver undergoing a memory bottleneck, which causes it to crash with an out-of-memory (OOM) condition and get restarted, or become unresponsive due to frequent full garbage collection. The reason for th...

16 More Replies
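The driver OOM described in the reply above is often a symptom of unbounded fan-out: all parallel notebook runs are launched at once, so the driver holds every live REPL and result buffer simultaneously. A minimal sketch of capping that concurrency with a bounded thread pool; `run_notebook` is a hypothetical stand-in for whatever triggers one notebook run (e.g. `dbutils.notebook.run` on Databricks):

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def run_notebook(path):
    # Hypothetical placeholder for a real notebook trigger,
    # e.g. dbutils.notebook.run(path, timeout_seconds) on Databricks.
    return f"finished {path}"

notebooks = [f"/jobs/task_{i}" for i in range(8)]

# Cap concurrency at 4 instead of launching all 8 at once, so the
# driver keeps fewer live kernels and result buffers in memory.
results = []
with ThreadPoolExecutor(max_workers=4) as pool:
    futures = {pool.submit(run_notebook, nb): nb for nb in notebooks}
    for fut in as_completed(futures):
        results.append(fut.result())

print(len(results))  # all 8 tasks complete, but at most 4 ran at a time
```

Lowering `max_workers` trades wall-clock time for driver headroom; the right value depends on the cluster's driver memory.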
Harsh_Paliwal
by New Contributor
  • 1267 Views
  • 1 reply
  • 0 kudos

java.lang.Exception: Unable to start python kernel for ReplId-79217-e05fc-0a4ce-2, kernel exited with exit code 1.

I am running a parameterized Autoloader notebook in a workflow. This notebook is being called 29 times in parallel, and FYI UC is also enabled. I am facing this error: java.lang.Exception: Unable to start python kernel for ReplId-79217-e05fc-0a4ce-2, ke...

Latest Reply
Anonymous
Not applicable

@Harsh Paliwal: The error message suggests that there might be a conflict with the xtables lock. One thing you could try is to add the -w option as suggested by the error message. You can add the following command to the beginning of your notebook t...

kll
by New Contributor III
  • 1716 Views
  • 1 reply
  • 0 kudos

Fatal error: The Python kernel is unresponsive when attempting to query data from AWS Redshift within Jupyter notebook

I am running a Jupyter notebook on a cluster with configuration: 12.2 LTS (includes Apache Spark 3.3.2, Scala 2.12); worker type: i3.xlarge, 30.5 GB memory, 4 cores; min 2 and max 8 workers. cursor = conn.cursor() cursor.execute( """ ...

Latest Reply
Debayan
Esteemed Contributor III

Hi, could you please confirm the usage of your cluster while running this job? You can monitor the performance here, with different metrics: https://docs.databricks.com/clusters/clusters-manage.html#monitor-performance. Also, please tag @Debayan with...

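A common way a single query makes the driver's Python kernel unresponsive is fetching the full result set into memory at once. A hedged sketch of the standard DB-API 2.0 fix, pulling rows in fixed-size chunks with `fetchmany`; sqlite3 stands in here for the warehouse connection, but psycopg2 and redshift_connector cursors expose the same method:

```python
import sqlite3

# Demo with the stdlib sqlite3 driver; the same DB-API 2.0 pattern
# applies to a Redshift cursor (psycopg2 / redshift_connector).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER)")
conn.executemany("INSERT INTO events VALUES (?)", [(i,) for i in range(10_000)])

cursor = conn.cursor()
cursor.execute("SELECT id FROM events")

total = 0
while True:
    rows = cursor.fetchmany(1000)  # 1,000 rows per round trip, never all at once
    if not rows:
        break
    total += len(rows)  # process the chunk, then let it be garbage-collected

print(total)
```

Only one chunk is resident at a time, so peak driver memory stays bounded by the chunk size rather than the result-set size.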
dhanu
by New Contributor
  • 922 Views
  • 2 replies
  • 0 kudos

Fatal error: Python kernel is unresponsive

I have submitted around 90 jobs at a time to Databricks. The jobs were running continuously for 2 hours; after that I am getting the fatal error "Python kernel is unresponsive". I am using Databricks Runtime version 11.2. Cluster configuration details are given...

Latest Reply
jose_gonzalez
Moderator

Hi @Dhanaraj Jogihalli, just a friendly follow-up. Did any of the responses help you to resolve your question? If so, please mark it as best. Otherwise, please let us know if you still need help.

1 More Replies
Valon98
by New Contributor III
  • 10194 Views
  • 11 replies
  • 5 kudos

Resolved! During execution of a cell "RuntimeException: The python kernel is unresponsive."

Hi all, I am running preprocessing to create my train set and test set. Does anyone know why, during execution, my cell gives the error "RuntimeException: The python kernel is unresponsive."? How can I solve it?

Latest Reply
Anonymous
Not applicable

Hey there @Valerio Goretti. Hope all is well! Just wanted to check in: were you able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from...

10 More Replies
Orianh
by Valued Contributor II
  • 17262 Views
  • 25 replies
  • 35 kudos

Fatal error: Python kernel is unresponsive

Hey guys, I'm using petastorm to train a DNN. First I convert a Spark df with make_spark_converter and then open a reader on the materialized dataset. While I run the training session on only a subset of the data everything works fine, but when I'm using all...

Latest Reply
Anonymous
Not applicable

Same error. This started a few days ago on notebooks that used to run fine in the past. Now I cannot finish a notebook. I have already disabled almost all output being streamed to the result buffer, but the problem persists. I am left with <50 lines ...

24 More Replies
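The recurring pattern in this thread is that training succeeds on a subset but kills the kernel on the full dataset. The underlying fix, which petastorm's reader applies to materialized Parquet, is to stream fixed-size batches instead of holding everything at once. A framework-agnostic sketch of that batching idea; the `batches` helper below is purely illustrative, not part of the petastorm API:

```python
def batches(iterable, batch_size):
    """Yield fixed-size lists from any iterable without materializing it all."""
    batch = []
    for item in iterable:
        batch.append(item)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:
        yield batch  # final partial batch

# Only one batch is in memory at a time, regardless of dataset size,
# so peak memory is set by batch_size rather than by the full dataset.
sizes = [len(b) for b in batches(range(10), batch_size=4)]
print(sizes)  # [4, 4, 2]
```

If the reader itself is streaming but the kernel still dies, the batch size or per-batch preprocessing is usually what has to shrink.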
SusuTheSeeker
by New Contributor III
  • 2241 Views
  • 8 replies
  • 3 kudos

Kernel switches to unknown using pyspark

I am working in a notebook in JupyterHub. I am using a PySpark dataframe for analyzing text. More precisely, I am doing sentiment analysis of newspaper articles. The code works until I get to some point where the kernel is busy and after approximately...

Latest Reply
Kaniz
Community Manager

Hi @Suad Hidbani, we haven't heard from you since our last responses, and I was checking back to see if you have a resolution yet. If you have any solution, please share it with the community, as it can be helpful to others. Otherwise, we will...

7 More Replies