Generative AI
Explore discussions on generative artificial intelligence techniques and applications within the Databricks Community. Share ideas, challenges, and breakthroughs in this cutting-edge field.
Importing LanceDB Library Crashes Python Driver

txti
New Contributor III

Hello, I am unable to import the LanceDB library.

My environment is configured as follows:

  • Single Node of type g4dn.xlarge
  • DBR 16.1 ML for GPU

Code to Reproduce:

%pip install lancedb==0.17.0
dbutils.library.restartPython()

# Crashes on import
import lancedb
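Because the crash happens in native code and kills the whole Python driver, one way to capture it without losing the kernel is to attempt the import in a child process and inspect its exit code. This is a sketch added for illustration, not part of the original report:

```python
# Sketch: import lancedb in a child process so a native crash
# (e.g. SIGABRT, exit code 134) does not take down the notebook kernel.
import subprocess
import sys

result = subprocess.run(
    [sys.executable, "-c", "import lancedb; print(lancedb.__version__)"],
    capture_output=True,
    text=True,
)
print("exit code:", result.returncode)
print(result.stdout or result.stderr)
```

If the import aborts, the child's exit code is reported in the cell output instead of crashing the driver.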


The error details from the driver logs appear below.

Thank you in advance for your help.

25/01/10 18:49:01 INFO PythonDriverLocalBase$RedirectThread: Python RedirectThread exit
25/01/10 18:49:01 INFO PythonDriverLocalBase$RedirectThread: Python RedirectThread exit
25/01/10 18:49:01 INFO ReplCrashUtils$: python shell exit code: 134; replId: ReplId-19450-b9978-b, pid: 69559
25/01/10 18:49:01 INFO ReplCrashUtils$: strace is not enabled. To turn it on, set the Spark conf `spark.databricks.driver.strace.enabled` to true.
25/01/10 18:49:01 INFO MlflowAutologEventPublisher$: Subscriber with repl ID ReplId-19450-b9978-b not responding to health checks, removing it
25/01/10 18:49:03 INFO ProgressReporter$: Removed result fetcher for 1736521127819_8384873761817337603_8ac6bfae05e44a7a97380a2a98919425
25/01/10 18:49:03 INFO PythonDriverWrapper: Repl ReplInfo(driverReplId=ReplId-19450-b9978-b, chauffeurReplId=ReplId-19450-b9978-b,
 executionContextId=Some(ExecutionContextIdV2(4019583506594996775)), lazyInfoInitialized=true) got an exception during execution
com.databricks.backend.common.rpc.SparkDriverExceptions$ReplStateException
    at com.databricks.backend.daemon.driver.JupyterKernelListener.waitForExecution(JupyterKernelListener.scala:1299)
    at com.databricks.backend.daemon.driver.JupyterKernelListener.executeCommand(JupyterKernelListener.scala:1350)
    at com.databricks.backend.daemon.driver.JupyterDriverLocal.executePython(JupyterDriverLocal.scala:1355)
    at com.databricks.backend.daemon.driver.JupyterDriverLocal.repl(JupyterDriverLocal.scala:1218)
    at com.databricks.backend.daemon.driver.DriverLocal.$anonfun$execute$33(DriverLocal.scala:1228)
    at com.databricks.unity.UCSEphemeralState$Handle.runWith(UCSEphemeralState.scala:51)
    at com.databricks.unity.HandleImpl.runWith(UCSHandle.scala:104)
    at com.databricks.backend.daemon.driver.DriverLocal.$anonfun$execute$28(DriverLocal.scala:1219)
    at com.databricks.logging.AttributionContextTracing.$anonfun$withAttributionContext$1(AttributionContextTracing.scala:49)
    at com.databricks.logging.AttributionContext$.$anonfun$withValue$1(AttributionContext.scala:295)
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
    at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:291)
    at com.databricks.logging.AttributionContextTracing.withAttributionContext(AttributionContextTracing.scala:47)
    at com.databricks.logging.AttributionContextTracing.withAttributionContext$(AttributionContextTracing.scala:44)
    at com.databricks.backend.daemon.driver.DriverLocal.withAttributionContext(DriverLocal.scala:120)
    at com.databricks.logging.AttributionContextTracing.withAttributionTags(AttributionContextTracing.scala:96)
    at com.databricks.logging.AttributionContextTracing.withAttributionTags$(AttributionContextTracing.scala:77)
    at com.databricks.backend.daemon.driver.DriverLocal.withAttributionTags(DriverLocal.scala:120)
    at com.databricks.backend.daemon.driver.DriverLocal.$anonfun$execute$1(DriverLocal.scala:1151)
    at com.databricks.backend.daemon.driver.DriverLocal$.$anonfun$maybeSynchronizeExecution$4(DriverLocal.scala:1613)
    at com.databricks.backend.daemon.driver.DriverLocal.execute(DriverLocal.scala:816)
    at com.databricks.backend.daemon.driver.DriverWrapper.$anonfun$tryExecutingCommand$2(DriverWrapper.scala:1040)
    at scala.util.Try$.apply(Try.scala:213)
    at com.databricks.backend.daemon.driver.DriverWrapper.$anonfun$tryExecutingCommand$1(DriverWrapper.scala:1029)
    at com.databricks.backend.daemon.driver.DriverWrapper.$anonfun$tryExecutingCommand$3(DriverWrapper.scala:1075)
    at com.databricks.logging.UsageLogging.executeThunkAndCaptureResultTags$1(UsageLogging.scala:613)
    at com.databricks.logging.UsageLogging.$anonfun$recordOperationWithResultTags$4(UsageLogging.scala:636)
    at com.databricks.logging.AttributionContextTracing.$anonfun$withAttributionContext$1(AttributionContextTracing.scala:49)
    at com.databricks.logging.AttributionContext$.$anonfun$withValue$1(AttributionContext.scala:295)
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
    at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:291)
    at com.databricks.logging.AttributionContextTracing.withAttributionContext(AttributionContextTracing.scala:47)
    at com.databricks.logging.AttributionContextTracing.withAttributionContext$(AttributionContextTracing.scala:44)
    at com.databricks.backend.daemon.driver.DriverWrapper.withAttributionContext(DriverWrapper.scala:81)
    at com.databricks.logging.AttributionContextTracing.withAttributionTags(AttributionContextTracing.scala:96)
    at com.databricks.logging.AttributionContextTracing.withAttributionTags$(AttributionContextTracing.scala:77)
    at com.databricks.backend.daemon.driver.DriverWrapper.withAttributionTags(DriverWrapper.scala:81)
    at com.databricks.logging.UsageLogging.recordOperationWithResultTags(UsageLogging.scala:608)
    at com.databricks.logging.UsageLogging.recordOperationWithResultTags$(UsageLogging.scala:517)
    at com.databricks.backend.daemon.driver.DriverWrapper.recordOperationWithResultTags(DriverWrapper.scala:81)
    at com.databricks.backend.daemon.driver.DriverWrapper.tryExecutingCommand(DriverWrapper.scala:1075)
    at com.databricks.backend.daemon.driver.DriverWrapper.executeCommandAndGetError(DriverWrapper.scala:777)
    at com.databricks.backend.daemon.driver.DriverWrapper.executeCommand(DriverWrapper.scala:870)
    at com.databricks.backend.daemon.driver.DriverWrapper.$anonfun$runInnerLoop$1(DriverWrapper.scala:641)
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
    at com.databricks.logging.AttributionContextTracing.$anonfun$withAttributionContext$1(AttributionContextTracing.scala:49)
    at com.databricks.logging.AttributionContext$.$anonfun$withValue$1(AttributionContext.scala:295)
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
    at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:291)
    at com.databricks.logging.AttributionContextTracing.withAttributionContext(AttributionContextTracing.scala:47)
    at com.databricks.logging.AttributionContextTracing.withAttributionContext$(AttributionContextTracing.scala:44)
    at com.databricks.backend.daemon.driver.DriverWrapper.withAttributionContext(DriverWrapper.scala:81)
    at com.databricks.backend.daemon.driver.DriverWrapper.runInnerLoop(DriverWrapper.scala:636)
    at com.databricks.backend.daemon.driver.DriverWrapper.runInner(DriverWrapper.scala:559)
    at com.databricks.backend.daemon.driver.DriverWrapper.run(DriverWrapper.scala:342)
    at java.base/java.lang.Thread.run(Thread.java:840)
25/01/10 18:49:03 INFO JupyterDriverLocal: restart JupyterDriverLocal repl ReplId-19450-b9978-b
25/01/10 18:49:03 ERROR WsfsHttpClient: Failed to get pid namespace id for 69559 with error java.nio.file.NoSuchFileException: /proc/69559/ns/pid
25/01/10 18:49:03 ERROR WsfsHttpClient: Failed to get pid namespace id for 69559 with error java.nio.file.NoSuchFileException: /proc/69559/ns/pid
25/01/10 18:49:04 INFO JupyterDriverLocal: Starting gateway server for repl ReplId-19450-b9978-b
25/01/10 18:49:04 INFO PythonPy4JUtil: Using pinned thread mode in Py4J
25/01/10 18:49:04 INFO IpykernelUtils$: Python process builder: [bash, /local_disk0/.ephemeral_nfs/envs/pythonEnv-f3a0c4e2-8f30-4fc1-8c6c-295c655d85fc/python_start_notebook_scoped.sh, /databricks/spark/python/pyspark/wrapped_python.py, root, /local_disk0/.ephemeral_nfs/envs/pythonEnv-f3a0c4e2-8f30-4fc1-8c6c-295c655d85fc/bin/python, /databricks/python_shell/scripts/db_ipykernel_launcher.py, -f, /databricks/kernel-connections/13a59ac434ad4d83c7b049b615b53fc6b0f9db1e77778881fdd6080e805f7702.json]
25/01/10 18:49:04 INFO IpykernelUtils$: Cgroup isolation disabled, not placing python process with ReplId=ReplId-19450-b9978-b in repl cgroup
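For context, the line "python shell exit code: 134" above indicates the process was killed by a signal: shell exit codes above 128 encode 128 + the signal number, and 134 - 128 = 6 is SIGABRT, i.e. a native-code abort rather than a Python exception. A quick check:

```python
# Decode the driver-log exit code 134 into the terminating signal.
# Exit codes above 128 mean the process died from signal (code - 128).
import signal

exit_code = 134  # from "python shell exit code: 134" in the driver log
print(signal.Signals(exit_code - 128).name)  # SIGABRT
```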

7 REPLIES

Walter_C
Databricks Employee

Is this issue specific to this DBR? If you use a lower DBR version on the cluster, do you see any difference? Were you able to import this library in the recent past?

txti
New Contributor III

I retried with DBR 15.4 LTS ML and was able to import LanceDB.

Hopefully it is fixed in DBR > 16.1

Thanks,
Manny

Walter_C
Databricks Employee

Good to know. Let me do some research to see if there is any conflict that could cause this. DBR 16.2 is also upcoming, so it will be worth checking whether it resolves the issue.

Walter_C
Databricks Employee

You mentioned the error in the driver logs, but what specific error appears in the cell output when you run the import?

Walter_C
Databricks Employee

I get "The Python kernel is unresponsive." Let me know if you see the same.

victor58
New Contributor II

Hello Walter, I have been receiving the same error for the last two hours: "The Python kernel is unresponsive."
However, I was running a SQL action from Databricks against PostgreSQL.

Walter_C
Databricks Employee

Hello @victor58, thanks for your question. In your case, let's start with the validations in this KB article: https://kb.databricks.com/en_US/clusters/python-kernel-is-unresponsive-error-message
