New error: middleware.base:exception while intercepting server message
01-24-2025 07:59 AM
We started getting a very weird error at random from Databricks. It comes from cells that routinely work, and once it happens it happens on every cell. It appears to include the full text of a .py file we're importing, which I've had to remove here.
This is the output from a Python cell with nothing but print(1) in it:
```
1
ERROR:dbruntime.lsp_backend.middleware.base:exception while intercepting server message: {'jsonrpc': '2.0', 'method': 'textDocument/didOpen', 'params': {'cellRanges': [{'startLine': 0, 'stopLine': 12, 'commandId': 2987329298861966}, {'startLine': 12, 'stopLine': 26, 'commandId': 7910971740075064}, {'startLine': 26, 'stopLine': 36, 'commandId': 2987329298861968}, {'startLine': 36, 'stopLine': 74, 'commandId': 2987329298861969}, {'startLine': 74, 'stopLine': 90, 'commandId': 2987329298861970}, {'startLine': 90, 'stopLine': 95, 'commandId': 2987329298861971}, {'startLine': 95, 'stopLine': 97, 'commandId': 7910971740075065}, {'startLine': 97, 'stopLine': 99, 'commandId': 2987329298861972}, {'startLine': 99, 'stopLine': 107, 'commandId': 2987329298861973}, {'startLine': 107, 'stopLine': 111, 'commandId': 7910971740075063}, {'startLine': 111, 'stopLine': 113, 'commandId': 2987329298861974}, {'startLine': 113, 'stopLine': 114, 'commandId': 2987329298861975}, {'startLine': 114, 'stopLine': 115, 'commandId': 2987329298861976}, {'startLine': 115, 'stopLine': 122, 'commandId': 2987329298861977}, {'startLine': 122, 'stopLine': 129, 'commandId': 2987329298861978}, {'startLine': 129, 'stopLine': 131, 'commandId': 2987329298861979}, {'startLine': 131, 'stopLine': 134, 'commandId': 2987329298861980}, {'startLine': 134, 'stopLine': 155, 'commandId': 2987329298861981}, {'startLine': 155, 'stopLine': 156, 'commandId': 2987329298861982}, {'startLine': 156, 'stopLine': 157, 'commandId': 2987329298861983}, {'startLine': 157, 'stopLine': 158, 'commandId': 2987329298861984}], 'cursorPosition': {'line': 95, 'character': 0}, 'textDocument': {'uri': '/notebook/e0152433-fae9-421b-955e-55303a1ff30b/2987329298861965',
Traceback (most recent call last):
  File "/databricks/python_shell/dbruntime/lsp_backend/middleware/base.py", line 54, in intercept_message_to_server_safe
    return self.intercept_message_to_server(msg)
  File "/databricks/python_shell/dbruntime/lsp_backend/middleware/pyspark_select_diagnostics.py", line 34, in intercept_message_to_server
    diagnostics += lint(self.parse(cell), getattr(self.shell, "user_ns", None), start)
  File "/databricks/python_shell/dbruntime/lsp_backend/pyspark_select_diagnostics_utils.py", line 174, in lint
    diagnostics += make_diagnostics(select_string_arg_nodes, df_name, list(spark_df.columns),
  File "/databricks/python_shell/dbruntime/lsp_backend/pyspark_select_diagnostics_utils.py", line 128, in make_diagnostics
    select_arg_node_value = eval(select_arg_node.get_code())
  File "<string>", line 2
    'netsuite_internal_id'
IndentationError: unexpected indent
```
Our current suspect is the newer GCE clusters on GCP. I was wondering if anyone else has run into anything like this before.
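In case it helps anyone else digging into this: reading the trace, the notebook linter appears to parse the cell, take the source of the select() argument, and eval() it, and if that source is grabbed together with its leading whitespace, eval() fails exactly like above. A minimal sketch of that failure mode, assuming the parser is parso (the get_code() call and the PythonNode class in the traces suggest it) and using a made-up column name:
```
import parso

cell = "df.select(\n    'some_column'\n)\n"

# Find the string literal passed to select()
leaf = parso.parse(cell).get_first_leaf()
while leaf is not None and leaf.type != "string":
    leaf = leaf.get_next_leaf()

# get_code() includes the leaf's prefix by default, i.e. the newline and
# indentation that precede it in the cell source.
code = leaf.get_code()
print(repr(code))  # "\n    'some_column'"

# eval() then sees an indented line 2 and raises
# IndentationError: unexpected indent (File "<string>", line 2)
eval(code)
```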
01-24-2025 08:34 AM
Tried other cluster configurations (GCE / GKE); it doesn't seem to matter.
01-25-2025 06:43 AM
Similar error here. It also started this morning:
```
Traceback (most recent call last):
  File "/databricks/python_shell/dbruntime/lsp_backend/middleware/base.py", line 54, in intercept_message_to_server_safe
    return self.intercept_message_to_server(msg)
  File "/databricks/python_shell/dbruntime/lsp_backend/middleware/pyspark_select_diagnostics.py", line 34, in intercept_message_to_server
    diagnostics += lint(self.parse(cell), getattr(self.shell, "user_ns", None), start)
  File "/databricks/python_shell/dbruntime/lsp_backend/pyspark_select_diagnostics_utils.py", line 174, in lint
    diagnostics += make_diagnostics(select_string_arg_nodes, df_name, list(spark_df.columns),
  File "/databricks/python_shell/dbruntime/lsp_backend/pyspark_select_diagnostics_utils.py", line 128, in make_diagnostics
    select_arg_node_value = eval(select_arg_node.get_code())
  File "<string>", line 2
    'channel_order_number'
IndentationError: unexpected indent
```
01-27-2025 01:13 AM
Similar error for me
```
ERROR:dbruntime.lsp_backend.middleware.base:exception while intercepting server message: {'jsonrpc': '2.0', 'method': 'textDocument/didOpen', 'params': {'cellRanges': [{'startLine': 0, 'stopLine': 6, 'commandId': 5146433554673168}, {'startLine': 6, 'stopLine': 8, 'commandId': 5146433554673169}, {'startLine': 8, 'stopLine': 62, 'commandId': 5146433554673170}, {'startLine': 62, 'stopLine': 64, 'commandId': 5146433554673171}, {'startLine': 64, 'stopLine': 79, 'commandId': 5146433554673172}, {'startLine': 79, 'stopLine': 90, 'commandId': 5146433554673173},
Traceback (most recent call last):
File "/databricks/python_shell/lib/dbruntime/lsp_backend/middleware/base.py", line 54, in intercept_message_to_server_safe
return self.intercept_message_to_server(msg)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/databricks/python_shell/lib/dbruntime/lsp_backend/middleware/pyspark_select_diagnostics.py", line 34, in intercept_message_to_server
diagnostics += lint(self.parse(cell), getattr(self.shell, "user_ns", None), start)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/databricks/python_shell/lib/dbruntime/lsp_backend/pyspark_select_diagnostics_utils.py", line 170, in lint
df_name = get_dataframe_name(node)
^^^^^^^^^^^^^^^^^^^^^^^^
File "/databricks/python_shell/lib/dbruntime/lsp_backend/pyspark_select_diagnostics_utils.py", line 98, in get_dataframe_name
return node.children[0].value # type: ignore [attr-defined]
^^^^^^^^^^^^^^^^^^^^^^
AttributeError: 'PythonNode' object has no attribute 'value'
```
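If I'm reading this right, it is the same area of the linter: it assumes the first child of the parsed expression is a leaf with a .value, but for some expressions that child is itself a nested PythonNode, which has no .value. A small sketch of that behaviour, again assuming parso and with a made-up expression:
```
import parso

# A plain "df.select(...)": the first child is a Name leaf, which has .value ...
simple = parso.parse("df.select('a')\n").children[0].children[0]
print(type(simple.children[0]).__name__, simple.children[0].value)  # Name df

# ... but wrap the dataframe expression in parentheses and the first child is a
# nested PythonNode, reproducing the AttributeError from the trace above.
wrapped = parso.parse("(df1.join(df2)).select('a')\n").children[0].children[0]
print(type(wrapped.children[0]).__name__)  # PythonNode
wrapped.children[0].value  # AttributeError: 'PythonNode' object has no attribute 'value'
```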
01-27-2025 01:32 AM - edited 01-27-2025 01:53 AM
Starting this morning we have also been facing the same problem.
To resolve it we downgraded the cluster's DBR from 15.4 to 14.3, and everything worked for a while. After an hour the problem occurred again.
01-27-2025 02:07 AM
Furthermore, after this error, if you try to format (CTRL+SHIFT+F) any cell, even one that only contains a simple command like `print("something")`, it shows the following message:
Format cell failed
Python indentation error(s) exist in the cell. Please fix the error(s) before formatting
01-27-2025 06:00 AM
Same story here with 15.4 LTS.
01-27-2025 08:36 AM
Getting the same issue with Runtime 16.1.
01-28-2025 01:59 AM
Hey everybody - sorry that you experienced these issues. We identified the issue and reverted the feature causing it. Things should be back to normal already.
01-28-2025 04:46 AM
For a 16.0 cluster the issue still arises.
01-28-2025 05:32 AM
The issue should be frontend-only. Can you hard refresh your browser page and let me know if that doesn't help?
01-28-2025 05:46 AM
Refresh didn't help. I restarted the cluster and now it works.
01-29-2025 04:59 AM
I'm curious. Can anyone let me know if you used code like this in your notebook?
```
import logging
logger = logging.getLogger(__name__)
logger.info("...")
```
This helps me understand the issue.
It would be even better if you could share the code in the notebook, but only if it doesn't contain PII.
01-31-2025 10:38 PM
It is failing with IndentationError: unexpected indent. The error message points to the line containing 'netsuite_internal_id'. Kindly locate this line in your code.
02-04-2025 07:21 AM
@TKr wrote:
> Hey everybody - sorry that you experienced these issues. We identified the issue and reverted the feature causing it. Things should be back to normal already.

I'm glad to hear that. Are you a Databricks employee?
Referring to your question, we didn't use any loggers at all, nor did we use __name__ in our scripts.
@NandiniN If you follow the stack trace, you'll notice it points to a line supposedly inside "/databricks/python_shell/dbruntime/lsp_backend/pyspark_select_diagnostics_utils.py"; I've cat-ed that file and that line is not in there at all.
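For what it's worth, anything compiled from a string via eval() is reported as File "<string>" in the traceback, so the quoted line only ever existed in the eval'd snippet, not in any .py file on disk. A quick sketch with a placeholder column name:
```
# Same shape as the failing snippet in the trace: a newline plus an indented string literal.
try:
    eval("\n    'some_column'")
except IndentationError as err:
    # The error's filename is "<string>", not pyspark_select_diagnostics_utils.py.
    print(err.filename, err.lineno, repr(err.text))
```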

