02-12-2024 08:48 AM
We are intermittently experiencing the issue below when running ordinary code in our Databricks notebook environment on the 13.3 LTS runtime, with a compute pool of r6id.large on-demand instances using local storage.
We first noticed this late last week, around the same time as the 14.3 LTS release.
Observed error messages:
```
py4j.Py4JException: An exception was raised by the Python Proxy. Return Message: Traceback (most recent call last):
  File "/databricks/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/clientserver.py", line 617, in _call_proxy
    return_value = getattr(self.pool[obj_id], method)(*params)
  File "/databricks/python_shell/dbruntime/pythonPathHook.py", line 118, in initStartingDirectory
    self._handle_sys_path_maybe_updated()
  File "/databricks/python_shell/dbruntime/pythonPathHook.py", line 90, in _handle_sys_path_maybe_updated
    self._restart_language_server_if_needed()
  File "/databricks/python_shell/dbruntime/pythonPathHook.py", line 85, in _restart_language_server_if_needed
    ls_manager.restart()
  File "/databricks/python_shell/dbruntime/lsp_backend/lsp_manager.py", line 348, in restart
    self.start()
  File "/databricks/python_shell/dbruntime/lsp_backend/lsp_manager.py", line 290, in start
    self.server_process = subprocess.Popen(
  File "/usr/lib/python3.10/subprocess.py", line 971, in __init__
    self._execute_child(args, executable, preexec_fn, close_fds,
  File "/usr/lib/python3.10/subprocess.py", line 1863, in _execute_child
    raise child_exception_type(errno_num, err_msg, err_filename)
FileNotFoundError: [Errno 2] No such file or directory: 'pylsp'
    at py4j.Protocol.getReturnValue(Protocol.java:476)
    at py4j.reflection.PythonProxyHandler.invoke(PythonProxyHandler.java:108)
    at com.sun.proxy.$Proxy70.initStartingDirectory(Unknown Source)
    at com.databricks.backend.daemon.driver.PythonDriverLocalBase.$anonfun$setWsfsWorkingDir$1(PythonDriverLocalBase.scala:316)
    at com.databricks.backend.daemon.driver.PythonDriverLocalBase.$anonfun$setWsfsWorkingDir$1$adapted(PythonDriverLocalBase.scala:307)
    at scala.Option.foreach(Option.scala:407)
    at com.databricks.backend.daemon.driver.PythonDriverLocalBase.setWsfsWorkingDir(PythonDriverLocalBase.scala:307)
    at com.databricks.backend.daemon.driver.PythonDriverLocalBase.addLibrariesToPythonPath(PythonDriverLocalBase.scala:244)
    at com.databricks.backend.daemon.driver.JupyterDriverLocal.setUpBeforeCommandRun(JupyterDriverLocal.scala:628)
    at com.databricks.backend.daemon.driver.JupyterDriverLocal.$anonfun$executePython$1(JupyterDriverLocal.scala:814)
    at com.databricks.backend.daemon.driver.JupyterKernelListener.preExecuteCommand(JupyterKernelListener.scala:951)
    at com.databricks.backend.daemon.driver.JupyterKernelListener.executeCommand(JupyterKernelListener.scala:1080)
    at com.databricks.backend.daemon.driver.JupyterDriverLocal.executePython(JupyterDriverLocal.scala:814)
    at com.databricks.backend.daemon.driver.JupyterDriverLocal.repl(JupyterDriverLocal.scala:697)
    at com.databricks.backend.daemon.driver.DriverLocal.$anonfun$execute$33(DriverLocal.scala:997)
    at com.databricks.unity.UCSEphemeralState$Handle.runWith(UCSEphemeralState.scala:41)
    at com.databricks.unity.HandleImpl.runWith(UCSHandle.scala:99)
    at com.databricks.backend.daemon.driver.DriverLocal.$anonfun$execute$22(DriverLocal.scala:980)
    at com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1(UsageLogging.scala:426)
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
    at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:196)
    at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:424)
    at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:418)
    at com.databricks.backend.daemon.driver.DriverLocal.withAttributionContext(DriverLocal.scala:69)
    at com.databricks.logging.UsageLogging.withAttributionTags(UsageLogging.scala:470)
    at com.databricks.logging.UsageLogging.withAttributionTags$(UsageLogging.scala:455)
    at com.databricks.backend.daemon.driver.DriverLocal.withAttributionTags(DriverLocal.scala:69)
    at com.databricks.backend.daemon.driver.DriverLocal.execute(DriverLocal.scala:935)
    at com.databricks.backend.daemon.driver.DriverWrapper.$anonfun$tryExecutingCommand$1(DriverWrapper.scala:798)
    at scala.util.Try$.apply(Try.scala:213)
    at com.databricks.backend.daemon.driver.DriverWrapper.tryExecutingCommand(DriverWrapper.scala:790)
    at com.databricks.backend.daemon.driver.DriverWrapper.executeCommandAndGetError(DriverWrapper.scala:643)
    at com.databricks.backend.daemon.driver.DriverWrapper.executeCommand(DriverWrapper.scala:744)
    at com.databricks.backend.daemon.driver.DriverWrapper.runInnerLoop(DriverWrapper.scala:520)
    at com.databricks.backend.daemon.driver.DriverWrapper.runInner(DriverWrapper.scala:436)
    at com.databricks.backend.daemon.driver.DriverWrapper.run(DriverWrapper.scala:279)
    at java.lang.Thread.run(Thread.java:750)
```
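The root cause surfaced in the traceback is that the runtime's language-server manager shells out to `pylsp` via `subprocess.Popen` and cannot find the executable. As a quick diagnostic (a hypothetical sketch, not an official Databricks check), you can verify from a notebook cell whether `pylsp` is visible on the driver's PATH:

```python
import os
import shutil

# Look up the executable the LSP manager tries to launch with subprocess.Popen.
# A result of None reproduces the FileNotFoundError from the traceback above.
pylsp_path = shutil.which("pylsp")
print("pylsp resolved to:", pylsp_path)

# Inspect the PATH the driver process actually uses; a custom container image
# that drops the expected binary locations will show up here.
for entry in os.environ.get("PATH", "").split(os.pathsep):
    print(entry)
```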
Additional related error messages:
```
py4j.Py4JException: Error while sending a command.
    at py4j.CallbackClient.sendCommand(CallbackClient.java:397)
    at py4j.CallbackClient.sendCommand(CallbackClient.java:356)
    at py4j.reflection.PythonProxyHandler.invoke(PythonProxyHandler.java:106)
    at com.sun.proxy.$Proxy58.updateWsfsWorkingDir(Unknown Source)
    at com.databricks.backend.daemon.driver.PythonDriverLocalBase.$anonfun$setWsfsWorkingDir$1(PythonDriverLocalBase.scala:318)
    at com.databricks.backend.daemon.driver.PythonDriverLocalBase.$anonfun$setWsfsWorkingDir$1$adapted(PythonDriverLocalBase.scala:307)
    at scala.Option.foreach(Option.scala:407)
    at com.databricks.backend.daemon.driver.PythonDriverLocalBase.setWsfsWorkingDir(PythonDriverLocalBase.scala:307)
    at com.databricks.backend.daemon.driver.PythonDriverLocalBase.addLibrariesToPythonPath(PythonDriverLocalBase.scala:244)
    at com.databricks.backend.daemon.driver.JupyterDriverLocal.setUpBeforeCommandRun(JupyterDriverLocal.scala:628)
    at com.databricks.backend.daemon.driver.JupyterDriverLocal.$anonfun$executePython$1(JupyterDriverLocal.scala:814)
    at com.databricks.backend.daemon.driver.JupyterKernelListener.preExecuteCommand(JupyterKernelListener.scala:951)
    at com.databricks.backend.daemon.driver.JupyterKernelListener.executeCommand(JupyterKernelListener.scala:1080)
    at com.databricks.backend.daemon.driver.JupyterDriverLocal.executePython(JupyterDriverLocal.scala:814)
    at com.databricks.backend.daemon.driver.JupyterDriverLocal.repl(JupyterDriverLocal.scala:697)
    at com.databricks.backend.daemon.driver.DriverLocal.$anonfun$execute$33(DriverLocal.scala:997)
    at com.databricks.unity.UCSEphemeralState$Handle.runWith(UCSEphemeralState.scala:41)
    at com.databricks.unity.HandleImpl.runWith(UCSHandle.scala:99)
    at com.databricks.backend.daemon.driver.DriverLocal.$anonfun$execute$22(DriverLocal.scala:980)
    at com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1(UsageLogging.scala:426)
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
    at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:196)
    at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:424)
    at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:418)
    at com.databricks.backend.daemon.driver.DriverLocal.withAttributionContext(DriverLocal.scala:69)
    at com.databricks.logging.UsageLogging.withAttributionTags(UsageLogging.scala:470)
    at com.databricks.logging.UsageLogging.withAttributionTags$(UsageLogging.scala:455)
    at com.databricks.backend.daemon.driver.DriverLocal.withAttributionTags(DriverLocal.scala:69)
    at com.databricks.backend.daemon.driver.DriverLocal.execute(DriverLocal.scala:935)
    at com.databricks.backend.daemon.driver.DriverWrapper.$anonfun$tryExecutingCommand$1(DriverWrapper.scala:798)
    at scala.util.Try$.apply(Try.scala:213)
    at com.databricks.backend.daemon.driver.DriverWrapper.tryExecutingCommand(DriverWrapper.scala:790)
    at com.databricks.backend.daemon.driver.DriverWrapper.executeCommandAndGetError(DriverWrapper.scala:643)
    at com.databricks.backend.daemon.driver.DriverWrapper.executeCommand(DriverWrapper.scala:744)
    at com.databricks.backend.daemon.driver.DriverWrapper.runInnerLoop(DriverWrapper.scala:520)
    at com.databricks.backend.daemon.driver.DriverWrapper.runInner(DriverWrapper.scala:436)
    at com.databricks.backend.daemon.driver.DriverWrapper.run(DriverWrapper.scala:279)
    at java.lang.Thread.run(Thread.java:750)
Caused by: py4j.Py4JNetworkException: Error while sending a command: c
p0
updateWsfsWorkingDir
s/Workspace/Users/[user omitted]
e
    at py4j.ClientServerConnection.sendCommand(ClientServerConnection.java:266)
    at py4j.CallbackClient.sendCommand(CallbackClient.java:384)
    ... 37 more
Caused by: py4j.Py4JException: Received empty command
    at py4j.ClientServerConnection.sendCommand(ClientServerConnection.java:249)
    ... 38 more
```
02-13-2024 11:42 PM
Hi Dylan,
Is your cluster using DCS (Databricks Container Services)? If so, we have an ongoing issue that may be related to the error you are facing. While we work on a fix, could you try the workaround below:
Spin up the cluster using the following custom image:
```
custom:release__13.3.x-snapshot-scala2.12__databricks-universe__13.3.8__4966bd2__0f3d13f__jenkins__a041514__format-3
```
To spin up a cluster using a custom image, follow this guide: https://kb.databricks.com/en_US/clusters/run-a-custom-databricks-runtime-on-your-cluster
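For reference, custom runtime images are typically specified by passing the `custom:` string as the `spark_version` when creating the cluster through the Clusters API. A minimal, hypothetical payload sketch (the cluster name and pool id are placeholders you must adapt; consult the KB article above for the authoritative steps):

```python
import json

# Hypothetical request body for POST /api/2.0/clusters/create.
# The custom image string from the workaround goes in spark_version.
payload = {
    "cluster_name": "dbr-13.3-custom-image-workaround",  # placeholder name
    "spark_version": (
        "custom:release__13.3.x-snapshot-scala2.12__databricks-universe"
        "__13.3.8__4966bd2__0f3d13f__jenkins__a041514__format-3"
    ),
    "instance_pool_id": "<your-pool-id>",  # placeholder for the r6id.large pool
    "num_workers": 1,
}
print(json.dumps(payload, indent=2))
```

Send this body to `https://<workspace-host>/api/2.0/clusters/create` with an `Authorization: Bearer <token>` header.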
02-14-2024 07:28 AM
We are using DCS, yes. I'll have a look at this workaround, thanks!
02-14-2024 07:30 AM - edited 02-14-2024 07:31 AM
Hi Navya,
Using the provided custom image fixes the issue in my case. Can you give an estimate of when the issue might be resolved and how long the custom image will remain available? Also, is 14.3 LTS impacted by this same issue?
02-15-2024 12:59 AM
Hi Vogeljo,
Yes, DBR 14.3 is impacted as well. The fix will be gradually rolled out to workspaces from Feb 26 to March 4. You can continue to use this custom image until then.
03-19-2024 11:56 AM
Hello @Navya_R ,
We are facing a similar issue when using 14.3 LTS with DCS.
For us, certain global init scripts are not getting applied. Is there a patch we can use for 14.3 LTS as well?
03-20-2024 12:59 AM
The solution to my problem was to add this at the end of my Dockerfile:
```
ENV LANG="C.UTF-8"
ENV LC_ALL="C.UTF-8"
```
Apparently, my issue was caused by a bad locale in the DCS Docker image.
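To confirm the locale fix took effect inside the container, a quick sketch you could run in a notebook cell (the `C.UTF-8` expectation assumes the Dockerfile ENV lines above were applied):

```python
import locale
import os

# The Dockerfile ENV lines should make both of these report C.UTF-8.
print("LANG   =", os.environ.get("LANG"))
print("LC_ALL =", os.environ.get("LC_ALL"))

# Python derives its default text encoding from the locale; a UTF-8 value
# here means subprocess launches and file I/O won't trip over a bad locale.
print("preferred encoding:", locale.getpreferredencoding())
```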