Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

New error: middleware.base:exception while intercepting server message

Kayla
Valued Contributor II

We started getting a very weird error at random from Databricks. This is from cells that routinely work, and after it happens once it will happen on every cell. It appears to include the full text of a .py file we're importing, which I've had to remove here.

This is the output from a Python cell with nothing but print(1) in it:

1
ERROR:dbruntime.lsp_backend.middleware.base:exception while intercepting server message: {'jsonrpc': '2.0', 'method': 'textDocument/didOpen', 'params': {'cellRanges': [{'startLine': 0, 'stopLine': 12, 'commandId': 2987329298861966}, {'startLine': 12, 'stopLine': 26, 'commandId': 7910971740075064}, {'startLine': 26, 'stopLine': 36, 'commandId': 2987329298861968}, {'startLine': 36, 'stopLine': 74, 'commandId': 2987329298861969}, {'startLine': 74, 'stopLine': 90, 'commandId': 2987329298861970}, {'startLine': 90, 'stopLine': 95, 'commandId': 2987329298861971}, {'startLine': 95, 'stopLine': 97, 'commandId': 7910971740075065}, {'startLine': 97, 'stopLine': 99, 'commandId': 2987329298861972}, {'startLine': 99, 'stopLine': 107, 'commandId': 2987329298861973}, {'startLine': 107, 'stopLine': 111, 'commandId': 7910971740075063}, {'startLine': 111, 'stopLine': 113, 'commandId': 2987329298861974}, {'startLine': 113, 'stopLine': 114, 'commandId': 2987329298861975}, {'startLine': 114, 'stopLine': 115, 'commandId': 2987329298861976}, {'startLine': 115, 'stopLine': 122, 'commandId': 2987329298861977}, {'startLine': 122, 'stopLine': 129, 'commandId': 2987329298861978}, {'startLine': 129, 'stopLine': 131, 'commandId': 2987329298861979}, {'startLine': 131, 'stopLine': 134, 'commandId': 2987329298861980}, {'startLine': 134, 'stopLine': 155, 'commandId': 2987329298861981}, {'startLine': 155, 'stopLine': 156, 'commandId': 2987329298861982}, {'startLine': 156, 'stopLine': 157, 'commandId': 2987329298861983}, {'startLine': 157, 'stopLine': 158, 'commandId': 2987329298861984}], 'cursorPosition': {'line': 95, 'character': 0}, 'textDocument': {'uri': '/notebook/e0152433-fae9-421b-955e-55303a1ff30b/2987329298861965', 
Traceback (most recent call last):
  File "/databricks/python_shell/dbruntime/lsp_backend/middleware/base.py", line 54, in intercept_message_to_server_safe
    return self.intercept_message_to_server(msg)
  File "/databricks/python_shell/dbruntime/lsp_backend/middleware/pyspark_select_diagnostics.py", line 34, in intercept_message_to_server
    diagnostics += lint(self.parse(cell), getattr(self.shell, "user_ns", None), start)
  File "/databricks/python_shell/dbruntime/lsp_backend/pyspark_select_diagnostics_utils.py", line 174, in lint
    diagnostics += make_diagnostics(select_string_arg_nodes, df_name, list(spark_df.columns),
  File "/databricks/python_shell/dbruntime/lsp_backend/pyspark_select_diagnostics_utils.py", line 128, in make_diagnostics
    select_arg_node_value = eval(select_arg_node.get_code())
  File "<string>", line 2
    'netsuite_internal_id'
IndentationError: unexpected indent
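For what it's worth, the last frames suggest the failure is reproducible in plain Python: `eval()` refuses any snippet whose first code line is indented, which is presumably what the linter gets back from `select_arg_node.get_code()`. A minimal sketch (not the actual Databricks code; the snippet contents are hypothetical, mimicking an argument that sits on line 2 of a `.select(...)` call):

```python
# Hypothetical repro of what the middleware appears to do: eval() a
# sub-expression extracted from the cell. If the extracted snippet keeps
# the newline/indentation prefix it had inside the call, eval() fails
# exactly like the traceback above ('File "<string>", line 2').
snippet = "\n    'netsuite_internal_id'"

try:
    eval(snippet)
except IndentationError as err:
    print(type(err).__name__ + ":", err.msg)  # IndentationError: unexpected indent
```

Stripping the leading whitespace before the `eval()` (the way `ast.literal_eval` does internally) would avoid this particular failure.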

Our current suspect is the newer GCE clusters on GCP. I was wondering if anyone else has run into anything like this before.

14 REPLIES

Kayla
Valued Contributor II

Tried other cluster configurations (GCE / GKE); it doesn't seem to matter.

ElNino
New Contributor II

Similar error here; it also started this morning:

```
Traceback (most recent call last):
  File "/databricks/python_shell/dbruntime/lsp_backend/middleware/base.py", line 54, in intercept_message_to_server_safe
    return self.intercept_message_to_server(msg)
  File "/databricks/python_shell/dbruntime/lsp_backend/middleware/pyspark_select_diagnostics.py", line 34, in intercept_message_to_server
    diagnostics += lint(self.parse(cell), getattr(self.shell, "user_ns", None), start)
  File "/databricks/python_shell/dbruntime/lsp_backend/pyspark_select_diagnostics_utils.py", line 174, in lint
    diagnostics += make_diagnostics(select_string_arg_nodes, df_name, list(spark_df.columns),
  File "/databricks/python_shell/dbruntime/lsp_backend/pyspark_select_diagnostics_utils.py", line 128, in make_diagnostics
    select_arg_node_value = eval(select_arg_node.get_code())
  File "<string>", line 2
    'channel_order_number'
IndentationError: unexpected indent
```

arun_qburst
New Contributor II

Similar error for me:
```

ERROR:dbruntime.lsp_backend.middleware.base:exception while intercepting server message: {'jsonrpc': '2.0', 'method': 'textDocument/didOpen', 'params': {'cellRanges': [{'startLine': 0, 'stopLine': 6, 'commandId': 5146433554673168}, {'startLine': 6, 'stopLine': 8, 'commandId': 5146433554673169}, {'startLine': 8, 'stopLine': 62, 'commandId': 5146433554673170}, {'startLine': 62, 'stopLine': 64, 'commandId': 5146433554673171}, {'startLine': 64, 'stopLine': 79, 'commandId': 5146433554673172}, {'startLine': 79, 'stopLine': 90, 'commandId': 5146433554673173},
Traceback (most recent call last):
  File "/databricks/python_shell/lib/dbruntime/lsp_backend/middleware/base.py", line 54, in intercept_message_to_server_safe
    return self.intercept_message_to_server(msg)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/databricks/python_shell/lib/dbruntime/lsp_backend/middleware/pyspark_select_diagnostics.py", line 34, in intercept_message_to_server
    diagnostics += lint(self.parse(cell), getattr(self.shell, "user_ns", None), start)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/databricks/python_shell/lib/dbruntime/lsp_backend/pyspark_select_diagnostics_utils.py", line 170, in lint
    df_name = get_dataframe_name(node)
              ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/databricks/python_shell/lib/dbruntime/lsp_backend/pyspark_select_diagnostics_utils.py", line 98, in get_dataframe_name
    return node.children[0].value # type: ignore [attr-defined]
           ^^^^^^^^^^^^^^^^^^^^^^
AttributeError: 'PythonNode' object has no attribute 'value'
```
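This variant looks like the same linter tripping over tree shapes it didn't expect: in parso-style parse trees, only leaf nodes carry a `.value`, while inner `PythonNode`s have only `children`, so a helper that assumes the receiver of `.select(...)` is a bare name breaks on anything chained. A rough stdlib analogue (using `ast` instead of parso, with a hypothetical `get_dataframe_name`, not the Databricks implementation) shows the same failure mode:

```python
import ast

def get_dataframe_name(call_src: str) -> str:
    """Hypothetical analogue of the helper in the traceback: assume the
    receiver of .select(...) is a bare name and return it."""
    call = ast.parse(call_src, mode="eval").body  # the .select(...) call
    receiver = call.func.value                    # expression before .select
    return receiver.id                            # AttributeError if not a Name

print(get_dataframe_name("df.select('a')"))       # df

# A chained receiver is a Call node, not a Name, and has no .id,
# mirroring "'PythonNode' object has no attribute 'value'" above.
try:
    get_dataframe_name("spark.table('t').select('a')")
except AttributeError as err:
    print("AttributeError:", err)
```

If that guess is right, any cell with a chained expression before `.select` could have triggered it.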

GeoPer
New Contributor III

Starting this morning we also faced the same problem. To resolve it we downgraded the cluster's DBR from 15.4 to 14.3, and everything worked for a while, but after an hour the problem occurred again.


GeoPer
New Contributor III

Furthermore, after this error, if you try to format (CTRL+SHIFT+F) any cell, even one that contains only a simple command like

```
print("something")
```

it shows the following message:

Format cell failed
Python indentation error(s) exist in the cell. Please fix the error(s) before formatting

bue
New Contributor II

Same story here with 15.4 LTS.

AnshulJain
New Contributor II

Getting the same issue with Runtime 16.1.

TKr
New Contributor II

Hey everybody - sorry that you experienced these issues. We identified the issue and reverted the feature causing it. Things should be back to normal already.

berti99
New Contributor II

For a 16.0 cluster the issue still arises.

TKr
New Contributor II

The issue should be frontend-only. Can you hard refresh your browser page and let me know if that doesn't help?

berti99
New Contributor II

Refresh didn't help. I restarted the cluster and now it works.

TKr
New Contributor II

I'm curious. Can anyone let me know if you used code like this in your notebook?

```
import logging
logger = logging.getLogger(__name__)
logger.info("...")
```

This helps me understand the issue. 

It would be even better if you could share the code in the notebook, but only if it doesn't contain PII.

NandiniN
Databricks Employee

It is failing with

IndentationError: unexpected indent

The error message points to the line containing 'netsuite_internal_id'. Kindly locate this line in your code.

Kayla
Valued Contributor II

@TKr 
"Hey everybody - sorry that you experienced these issues. We identified the issue and reverted the feature causing it. Things should be back to normal already."
I'm glad to hear that. Are you a Databricks employee?
Referring to your question, we didn't use any loggers at all or use __name__ in our scripts.

@NandiniN If you follow the stack trace, you'll notice it claims that line is inside the file "/databricks/python_shell/dbruntime/lsp_backend/pyspark_select_diagnostics_utils.py", which I've cat-ed; that line is not in there at all.
