Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Databricks Attribute Error: 'IPythonShell' object has no attribute 'kernel'

abi-tosh
New Contributor III

I have been getting this error repeatedly when trying to run a notebook. I have tried attaching several different clusters and installing some of the libraries it asked me to update. I have also tried clearing the notebook state and restarting the clusters, but I couldn't get rid of the error. Any tips to solve this?
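As a quick diagnostic (a hypothetical helper, not something from the thread or the Databricks docs), you can check whether the active interactive shell actually exposes the `kernel` attribute that ipywidgets requires at import time:

```python
def shell_has_kernel():
    """Report whether the current interactive shell exposes a `kernel`
    attribute. Returns False outside an IPython-based environment."""
    try:
        from IPython import get_ipython
    except ImportError:
        return False  # IPython is not installed at all
    ip = get_ipython()  # None when run outside an interactive session
    return ip is not None and hasattr(ip, "kernel")

print(shell_has_kernel())
```

If this prints `False` inside your notebook, the shell object Databricks provides lacks the attribute, and any import that pulls in ipywidgets (seaborn does, via its widgets module) can fail as shown below.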

6 REPLIES

Lakshay
Esteemed Contributor

I found this link: https://github.com/ipython/ipython/issues/10377.

Can you check if this is helpful?

abi-tosh
New Contributor III
AttributeError: 'IPythonShell' object has no attribute 'kernel'
---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
<command-2848973972456360> in <module>
      3 import pmdarima as pm
      4 import matplotlib.pyplot as plt
----> 5 import seaborn as sns
      6 import joblib
      7 from sklearn.model_selection import TimeSeriesSplit
 
/databricks/python_shell/dbruntime/PythonPackageImportsInstrumentation/__init__.py in import_patch(name, globals, locals, fromlist, level)
    165             # Import the desired module. If you're seeing this while debugging a failed import,
    166             # look at preceding stack frames for relevant error information.
--> 167             original_result = python_builtin_import(name, globals, locals, fromlist, level)
    168 
    169             is_root_import = thread_local._nest_level == 1
 
/databricks/python/lib/python3.8/site-packages/seaborn/__init__.py in <module>
     10 from .miscplot import *  # noqa: F401,F403
     11 from .axisgrid import *  # noqa: F401,F403
---> 12 from .widgets import *  # noqa: F401,F403
     13 from .colors import xkcd_rgb, crayons  # noqa: F401
     14 from . import cm  # noqa: F401
 
/databricks/python_shell/dbruntime/PythonPackageImportsInstrumentation/__init__.py in import_patch(name, globals, locals, fromlist, level)
    165             # Import the desired module. If you're seeing this while debugging a failed import,
    166             # look at preceding stack frames for relevant error information.
--> 167             original_result = python_builtin_import(name, globals, locals, fromlist, level)
    168 
    169             is_root_import = thread_local._nest_level == 1
 
/databricks/python/lib/python3.8/site-packages/seaborn/widgets.py in <module>
      5 # Lots of different places that widgets could come from...
      6 try:
----> 7     from ipywidgets import interact, FloatSlider, IntSlider
      8 except ImportError:
      9     import warnings
 
/databricks/python_shell/dbruntime/PythonPackageImportsInstrumentation/__init__.py in import_patch(name, globals, locals, fromlist, level)
    165             # Import the desired module. If you're seeing this while debugging a failed import,
    166             # look at preceding stack frames for relevant error information.
--> 167             original_result = python_builtin_import(name, globals, locals, fromlist, level)
    168 
    169             is_root_import = thread_local._nest_level == 1
 
/databricks/python/lib/python3.8/site-packages/ipywidgets/__init__.py in <module>
     57     register_comm_target()
     58 
---> 59 _handle_ipython()
 
/databricks/python/lib/python3.8/site-packages/ipywidgets/__init__.py in _handle_ipython()
     55     if ip is None:
     56         return
---> 57     register_comm_target()
     58 
     59 _handle_ipython()
 
/databricks/python/lib/python3.8/site-packages/ipywidgets/__init__.py in register_comm_target(kernel)
     45 def register_comm_target(kernel=None):
     46     """Register the jupyter.widget comm target"""
---> 47     comm_manager = get_comm_manager()
     48 
     49     comm_manager.register_target('jupyter.widget', Widget.handle_comm_opened)
 
/databricks/python/lib/python3.8/site-packages/ipywidgets/__init__.py in get_comm_manager()
     31         ip = get_ipython()
     32 
---> 33         if ip is not None and ip.kernel is not None:
     34             return get_ipython().kernel.comm_manager
     35 

This is the full traceback I'm getting.
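For what it's worth, the failing check at the bottom of that traceback can be reproduced in isolation. This is a minimal sketch with a stand-in class, not Databricks' actual shell object:

```python
class IPythonShellStub:
    """Stand-in for a shell object that, like Databricks' IPythonShell,
    does not define a `kernel` attribute."""
    pass

ip = IPythonShellStub()

# ipywidgets' get_comm_manager() evaluates `ip.kernel is not None`,
# which raises before the comparison ever happens:
try:
    _ = ip.kernel is not None
except AttributeError as err:
    print(err)  # 'IPythonShellStub' object has no attribute 'kernel'

# A guard that tolerates the missing attribute:
kernel = getattr(ip, "kernel", None)
print(kernel is None)  # prints: True
```

So the error is not about your own code; it is raised inside ipywidgets while the import of seaborn is still being resolved.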

TheSurestBlackE
New Contributor II

@Lakshay Goel I ran into the same issue as abi-tosh and I don't think I am running any unit tests. The thing that trips me up about this is that we had this working in other jobs and it is failing for us now. It fails for us on

from great_expectations.core import (
    ExpectationValidationResult,
    ExpectationSuiteValidationResult,
)

I have bugged several people at my company and we are unsure how to tackle this. Please help.

Here is the error I am getting; it's very similar every time.

TheSurestBlackE
New Contributor II
      2 from typing import Union, Dict, List
      3 
----> 4 from great_expectations.core.expectation_validation_result import ExpectationSuiteValidationResult, ExpectationValidationResult
      5 from great_expectations.dataset import SparkDFDataset
      6 from pyspark.sql import DataFrame, SparkSession
 
/databricks/python_shell/dbruntime/PythonPackageImportsInstrumentation/__init__.py in import_patch(name, globals, locals, fromlist, level)
    165             # Import the desired module. If you're seeing this while debugging a failed import,
    166             # look at preceding stack frames for relevant error information.
--> 167             original_result = python_builtin_import(name, globals, locals, fromlist, level)
    168 
    169             is_root_import = thread_local._nest_level == 1
 
/databricks/python/lib/python3.8/site-packages/great_expectations/__init__.py in <module>
      4 __version__ = get_versions()["version"]  # isort:skip
      5 
----> 6 from great_expectations.data_context.migrator.cloud_migrator import CloudMigrator
      7 
      8 del get_versions  # isort:skip
 
/databricks/python_shell/dbruntime/PythonPackageImportsInstrumentation/__init__.py in import_patch(name, globals, locals, fromlist, level)
    165             # Import the desired module. If you're seeing this while debugging a failed import,
    166             # look at preceding stack frames for relevant error information.
--> 167             original_result = python_builtin_import(name, globals, locals, fromlist, level)
    168 
    169             is_root_import = thread_local._nest_level == 1
 
/databricks/python/lib/python3.8/site-packages/great_expectations/data_context/__init__.py in <module>
----> 1 from great_expectations.data_context.data_context import (
      2     AbstractDataContext,
      3     BaseDataContext,
      4     CloudDataContext,
      5     DataContext,
 
/databricks/python_shell/dbruntime/PythonPackageImportsInstrumentation/__init__.py in import_patch(name, globals, locals, fromlist, level)
    165             # Import the desired module. If you're seeing this while debugging a failed import,
    166             # look at preceding stack frames for relevant error information.
--> 167             original_result = python_builtin_import(name, globals, locals, fromlist, level)
    168 
    169             is_root_import = thread_local._nest_level == 1
 
/databricks/python/lib/python3.8/site-packages/great_expectations/data_context/data_context/__init__.py in <module>
----> 1 from great_expectations.data_context.data_context.abstract_data_context import (
      2     AbstractDataContext,
      3 )
      4 from great_expectations.data_context.data_context.base_data_context import (
      5     BaseDataContext,
 
/databricks/python_shell/dbruntime/PythonPackageImportsInstrumentation/__init__.py in import_patch(name, globals, locals, fromlist, level)
    165             # Import the desired module. If you're seeing this while debugging a failed import,
    166             # look at preceding stack frames for relevant error information.
--> 167             original_result = python_builtin_import(name, globals, locals, fromlist, level)
    168 
    169             is_root_import = thread_local._nest_level == 1
 
/databricks/python/lib/python3.8/site-packages/great_expectations/data_context/data_context/abstract_data_context.py in <module>
    116 from great_expectations.datasource.new_datasource import BaseDatasource, Datasource
    117 from great_expectations.profile.basic_dataset_profiler import BasicDatasetProfiler
--> 118 from great_expectations.rule_based_profiler.data_assistant.data_assistant_dispatcher import (
    119     DataAssistantDispatcher,
    120 )
 
/databricks/python_shell/dbruntime/PythonPackageImportsInstrumentation/__init__.py in import_patch(name, globals, locals, fromlist, level)
    165             # Import the desired module. If you're seeing this while debugging a failed import,
    166             # look at preceding stack frames for relevant error information.
--> 167             original_result = python_builtin_import(name, globals, locals, fromlist, level)
    168 
    169             is_root_import = thread_local._nest_level == 1
 
/databricks/python/lib/python3.8/site-packages/great_expectations/rule_based_profiler/data_assistant/__init__.py in <module>
----> 1 from .data_assistant import DataAssistant
      2 from .onboarding_data_assistant import OnboardingDataAssistant
      3 from .volume_data_assistant import VolumeDataAssistant
 
/databricks/python_shell/dbruntime/PythonPackageImportsInstrumentation/__init__.py in import_patch(name, globals, locals, fromlist, level)
    165             # Import the desired module. If you're seeing this while debugging a failed import,
    166             # look at preceding stack frames for relevant error information.
--> 167             original_result = python_builtin_import(name, globals, locals, fromlist, level)
    168 
    169             is_root_import = thread_local._nest_level == 1
 
/databricks/python/lib/python3.8/site-packages/great_expectations/rule_based_profiler/data_assistant/data_assistant.py in <module>
     20 )
     21 from great_expectations.rule_based_profiler.config import ParameterBuilderConfig
---> 22 from great_expectations.rule_based_profiler.data_assistant_result import (
     23     DataAssistantResult,
     24 )
 
/databricks/python_shell/dbruntime/PythonPackageImportsInstrumentation/__init__.py in import_patch(name, globals, locals, fromlist, level)
    165             # Import the desired module. If you're seeing this while debugging a failed import,
    166             # look at preceding stack frames for relevant error information.
--> 167             original_result = python_builtin_import(name, globals, locals, fromlist, level)
    168 
    169             is_root_import = thread_local._nest_level == 1
 
/databricks/python/lib/python3.8/site-packages/great_expectations/rule_based_profiler/data_assistant_result/__init__.py in <module>
----> 1 from .data_assistant_result import DataAssistantResult
      2 from .onboarding_data_assistant_result import OnboardingDataAssistantResult
      3 from .volume_data_assistant_result import VolumeDataAssistantResult
 
/databricks/python_shell/dbruntime/PythonPackageImportsInstrumentation/__init__.py in import_patch(name, globals, locals, fromlist, level)
    165             # Import the desired module. If you're seeing this while debugging a failed import,
    166             # look at preceding stack frames for relevant error information.
--> 167             original_result = python_builtin_import(name, globals, locals, fromlist, level)
    168 
    169             is_root_import = thread_local._nest_level == 1
 
/databricks/python/lib/python3.8/site-packages/great_expectations/rule_based_profiler/data_assistant_result/data_assistant_result.py in <module>
     21 
     22 import altair as alt
---> 23 import ipywidgets as widgets
     24 import numpy as np
     25 import pandas as pd
 
/databricks/python_shell/dbruntime/PythonPackageImportsInstrumentation/__init__.py in import_patch(name, globals, locals, fromlist, level)
    165             # Import the desired module. If you're seeing this while debugging a failed import,
    166             # look at preceding stack frames for relevant error information.
--> 167             original_result = python_builtin_import(name, globals, locals, fromlist, level)
    168 
    169             is_root_import = thread_local._nest_level == 1
 
/databricks/python/lib/python3.8/site-packages/ipywidgets/__init__.py in <module>
     57     register_comm_target()
     58 
---> 59 _handle_ipython()
 
/databricks/python/lib/python3.8/site-packages/ipywidgets/__init__.py in _handle_ipython()
     55     if ip is None:
     56         return
---> 57     register_comm_target()
     58 
     59 _handle_ipython()
 
/databricks/python/lib/python3.8/site-packages/ipywidgets/__init__.py in register_comm_target(kernel)
     45 def register_comm_target(kernel=None):
     46     """Register the jupyter.widget comm target"""
---> 47     comm_manager = get_comm_manager()
     48 
     49     comm_manager.register_target('jupyter.widget', Widget.handle_comm_opened)
 
/databricks/python/lib/python3.8/site-packages/ipywidgets/__init__.py in get_comm_manager()
     31         ip = get_ipython()
     32 
---> 33         if ip is not None and ip.kernel is not None:
     34             return get_ipython().kernel.comm_manager
     35 
 
AttributeError: 'IPythonShell' object has no attribute 'kernel'

TheSurestBlackE
New Contributor II

Found it: Great Expectations released a new feature that put us in dependency hell. 0.16.3 makes life difficult for some people.
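If that diagnosis is right, one way to catch it early is to check the installed version before importing. A sketch: the package name and the 0.16.3 threshold come from this post, so adjust them to your environment, and the naive comparison only handles plain X.Y.Z version strings:

```python
from importlib.metadata import version, PackageNotFoundError

def installed_below(package: str, ceiling: str) -> bool:
    """True if `package` is installed with a version strictly below
    `ceiling` (naive numeric comparison; fine for plain X.Y.Z versions)."""
    try:
        installed = version(package)
    except PackageNotFoundError:
        return False

    def vtuple(v: str):
        return tuple(int(part) for part in v.split(".")[:3])

    return vtuple(installed) < vtuple(ceiling)

# Per this thread, 0.16.3 is the release that caused trouble:
print(installed_below("great_expectations", "0.16.3"))
```

In a notebook, pinning below that release (for example `%pip install "great_expectations<0.16.3"`, using the per-notebook `%pip` magic) is the usual escape hatch until the dependency conflict is resolved upstream.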

Anonymous
Not applicable

Hi @Toshali Mohapatra

Thank you for posting your question in our community! We are happy to assist you.

To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question?

This will also help other community members who may have similar questions in the future. Thank you for your participation and let us know if you need any further assistance! 
