02-23-2026 05:40 AM - edited 02-23-2026 05:45 AM
Hey there
The setup for getting the error can be very basic:
- Start a cluster (e.g. 17.3 LTS ML on a Standard_NV36ads_A10_v5 node [A10], 440 GB memory, 1 GPU)
- In a notebook, install the cv2 package like this:
%pip install opencv-python
This seems to install correctly.
- Then try:
import cv2
This gives the error:
Fatal error: The Python kernel is unresponsive.
The details of the error are:
Fatal error: The Python kernel is unresponsive.
---------------------------------------------------------------------------
The Python process exited with exit code 134 (SIGABRT: Aborted).
The last 10 KB of the process's stderr and stdout can be found below. See driver logs for full logs.
---------------------------------------------------------------------------
Last messages on stderr:
ap
Thread 0x00007f163a30f6c0 (most recent call first):
File "/usr/lib/python3.12/selectors.py", line 415 in select
File "/usr/lib/python3.12/socketserver.py", line 235 in serve_forever
File "/usr/lib/python3.12/threading.py", line 1010 in run
File "/usr/lib/python3.12/threading.py", line 1073 in _bootstrap_inner
File "/usr/lib/python3.12/threading.py", line 1030 in _bootstrap
Thread 0x00007f163ab106c0 (most recent call first):
File "/databricks/spark/python/lib/py4j-0.10.9.9-src.zip/py4j/java_gateway.py", line 2323 in run
File "/usr/lib/python3.12/threading.py", line 1010 in run
File "/usr/lib/python3.12/threading.py", line 1073 in _bootstrap_inner
File "/usr/lib/python3.12/threading.py", line 1030 in _bootstrap
Thread 0x00007f163b3116c0 (most recent call first):
File "/databricks/spark/python/lib/py4j-0.10.9.9-src.zip/py4j/clientserver.py", line 58 in run
File "/usr/lib/python3.12/threading.py", line 1073 in _bootstrap_inner
File "/usr/lib/python3.12/threading.py", line 1030 in _bootstrap
Thread 0x00007f16af7fe6c0 (most recent call first):
File "/databricks/python_shell/lib/dbruntime/supportability/thread_monitor.py", line 44 in log_thread_stack_trace
File "/usr/lib/python3.12/threading.py", line 1010 in run
File "/usr/lib/python3.12/threading.py", line 1073 in _bootstrap_inner
File "/usr/lib/python3.12/threading.py", line 1030 in _bootstrap
Thread 0x00007f16affff6c0 (most recent call first):
File "/usr/lib/python3.12/selectors.py", line 468 in select
File "/usr/lib/python3.12/asyncio/base_events.py", line 1949 in _run_once
File "/usr/lib/python3.12/asyncio/base_events.py", line 641 in run_forever
File "/databricks/python/lib/python3.12/site-packages/tornado/platform/asyncio.py", line 205 in start
File "/databricks/python/lib/python3.12/site-packages/ipykernel/control.py", line 23 in run
File "/usr/lib/python3.12/threading.py", line 1073 in _bootstrap_inner
File "/usr/lib/python3.12/threading.py", line 1030 in _bootstrap
Thread 0x00007f16c49706c0 (most recent call first):
File "/databricks/python/lib/python3.12/site-packages/ipykernel/iostream.py", line 387 in _watch_pipe_fd
File "/usr/lib/python3.12/threading.py", line 1010 in run
File "/usr/lib/python3.12/threading.py", line 1073 in _bootstrap_inner
File "/usr/lib/python3.12/threading.py", line 1030 in _bootstrap
Thread 0x00007f16c51716c0 (most recent call first):
File "/databricks/python/lib/python3.12/site-packages/ipykernel/iostream.py", line 387 in _watch_pipe_fd
File "/usr/lib/python3.12/threading.py", line 1010 in run
File "/usr/lib/python3.12/threading.py", line 1073 in _bootstrap_inner
File "/usr/lib/python3.12/threading.py", line 1030 in _bootstrap
Thread 0x00007f16c69746c0 (most recent call first):
File "/databricks/python/lib/python3.12/site-packages/ipykernel/heartbeat.py", line 106 in run
File "/usr/lib/python3.12/threading.py", line 1073 in _bootstrap_inner
File "/usr/lib/python3.12/threading.py", line 1030 in _bootstrap
Thread 0x00007f16c71756c0 (most recent call first):
File "/usr/lib/python3.12/selectors.py", line 468 in select
File "/usr/lib/python3.12/asyncio/base_events.py", line 1949 in _run_once
File "/usr/lib/python3.12/asyncio/base_events.py", line 641 in run_forever
File "/databricks/python/lib/python3.12/site-packages/tornado/platform/asyncio.py", line 205 in start
File "/databricks/python/lib/python3.12/site-packages/ipykernel/iostream.py", line 92 in _thread_main
File "/usr/lib/python3.12/threading.py", line 1010 in run
File "/usr/lib/python3.12/threading.py", line 1073 in _bootstrap_inner
File "/usr/lib/python3.12/threading.py", line 1030 in _bootstrap
Current thread 0x00007f16cb3e3080 (most recent call first):
File "<frozen importlib._bootstrap>", line 488 in _call_with_frames_removed
File "<frozen importlib._bootstrap_external>", line 1289 in create_module
File "<frozen importlib._bootstrap>", line 813 in module_from_spec
File "<frozen importlib._bootstrap>", line 921 in _load_unlocked
File "<frozen importlib._bootstrap>", line 1331 in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 1360 in _find_and_load
File "<frozen importlib._bootstrap>", line 1387 in _gcd_import
File "/usr/lib/python3.12/importlib/__init__.py", line 90 in import_module
File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-8567f863-5923-41ca-8571-2c1c2af0dd5b/lib/python3.12/site-packages/cv2/__init__.py", line 153 in bootstrap
File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-8567f863-5923-41ca-8571-2c1c2af0dd5b/lib/python3.12/site-packages/cv2/__init__.py", line 181 in <module>
File "<frozen importlib._bootstrap>", line 488 in _call_with_frames_removed
File "<frozen importlib._bootstrap_external>", line 995 in exec_module
File "/databricks/python_shell/lib/dbruntime/PostImportHook.py", line 243 in patched_exec_module
File "<frozen importlib._bootstrap>", line 935 in _load_unlocked
File "<frozen importlib._bootstrap>", line 1331 in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 1360 in _find_and_load
File "/databricks/python_shell/lib/dbruntime/autoreload/discoverability/autoreload_discoverability_hook.py", line 98 in _patched_import
File "/root/.ipykernel/2644/command-6818639807560775-571303353", line 1 in <module>
File "/databricks/python/lib/python3.12/site-packages/IPython/core/interactiveshell.py", line 3577 in run_code
File "/databricks/python/lib/python3.12/site-packages/IPython/core/interactiveshell.py", line 3517 in run_ast_nodes
File "/databricks/python/lib/python3.12/site-packages/IPython/core/interactiveshell.py", line 3334 in run_cell_async
File "/databricks/python/lib/python3.12/site-packages/IPython/core/async_helpers.py", line 128 in _pseudo_sync_runner
File "/databricks/python/lib/python3.12/site-packages/IPython/core/interactiveshell.py", line 3130 in _run_cell
File "/databricks/python/lib/python3.12/site-packages/IPython/core/interactiveshell.py", line 3075 in run_cell
File "/databricks/python/lib/python3.12/site-packages/ipykernel/zmqshell.py", line 549 in run_cell
File "/databricks/python/lib/python3.12/site-packages/ipykernel/ipkernel.py", line 449 in do_execute
File "/databricks/python_shell/lib/dbruntime/kernel.py", line 558 in do_execute
File "/databricks/python/lib/python3.12/site-packages/ipykernel/kernelbase.py", line 778 in execute_request
File "/databricks/python/lib/python3.12/site-packages/ipykernel/ipkernel.py", line 362 in execute_request
File "/databricks/python/lib/python3.12/site-packages/ipykernel/kernelbase.py", line 437 in dispatch_shell
File "/databricks/python/lib/python3.12/site-packages/ipykernel/kernelbase.py", line 534 in process_one
File "/databricks/python/lib/python3.12/site-packages/ipykernel/kernelbase.py", line 545 in dispatch_queue
File "/usr/lib/python3.12/asyncio/events.py", line 88 in _run
File "/usr/lib/python3.12/asyncio/base_events.py", line 1987 in _run_once
File "/usr/lib/python3.12/asyncio/base_events.py", line 641 in run_forever
File "/databricks/python/lib/python3.12/site-packages/tornado/platform/asyncio.py", line 205 in start
File "/databricks/python/lib/python3.12/site-packages/ipykernel/kernelapp.py", line 739 in start
File "/databricks/python/lib/python3.12/site-packages/traitlets/config/application.py", line 1075 in launch_instance
File "/databricks/python_shell/scripts/db_ipykernel_launcher.py", line 48 in main
File "/databricks/python_shell/scripts/db_ipykernel_launcher.py", line 52 in <module>
Extension modules: zmq.backend.cython._zmq, tornado.speedups, psutil._psutil_linux, psutil._psutil_posix, numpy._core._multiarray_umath, numpy.linalg._umath_linalg, pyarrow.lib, numpy.random._common, numpy.random.bit_generator, numpy.random._bounded_integers, numpy.random._mt19937, numpy.random.mtrand, numpy.random._philox, numpy.random._pcg64, numpy.random._sfc64, numpy.random._generator, pandas._libs.tslibs.ccalendar, pandas._libs.tslibs.np_datetime, pandas._libs.tslibs.dtypes, pandas._libs.tslibs.base, pandas._libs.tslibs.nattype, pandas._libs.tslibs.timezones, pandas._libs.tslibs.fields, pandas._libs.tslibs.timedeltas, pandas._libs.tslibs.tzconversion, pandas._libs.tslibs.timestamps, pandas._libs.properties, pandas._libs.tslibs.offsets, pandas._libs.tslibs.strptime, pandas._libs.tslibs.parsing, pandas._libs.tslibs.conversion, pandas._libs.tslibs.period, pandas._libs.tslibs.vectorized, pandas._libs.ops_dispatch, pandas._libs.missing, pandas._libs.hashtable, pandas._libs.algos, pandas._libs.interval, pandas._libs.lib, pyarrow._compute, pandas._libs.ops, pandas._libs.hashing, pandas._libs.arrays, pandas._libs.tslib, pandas._libs.sparse, pandas._libs.internals, pandas._libs.indexing, pandas._libs.index, pandas._libs.writers, pandas._libs.join, pandas._libs.window.aggregations, pandas._libs.window.indexers, pandas._libs.reshape, pandas._libs.groupby, pandas._libs.json, pandas._libs.parsers, pandas._libs.testing, grpc._cython.cygrpc, google._upb._message, _pydevd_bundle.pydevd_cython, _pydevd_sys_monitoring_cython, _pydevd_sys_monitoring._pydevd_sys_monitoring_cython, _brotli, simplejson._speedups, charset_normalizer.md, yaml._yaml, pyarrow._fs, pyarrow._azurefs, pyarrow._hdfs, pyarrow._gcsfs, pyarrow._s3fs, cython.cimports.libc.math, scipy._lib._ccallback_c, scipy.sparse._sparsetools, _csparsetools, scipy.sparse._csparsetools, scipy.linalg._fblas, scipy.linalg._flapack, scipy.linalg.cython_lapack, scipy.linalg._cythonized_array_utils, scipy.linalg._solve_toeplitz, 
scipy.linalg._decomp_lu_cython, scipy.linalg._matfuncs_sqrtm_triu, scipy.linalg._matfuncs_expm, scipy.linalg._linalg_pythran, scipy.linalg.cython_blas, scipy.linalg._decomp_update, scipy.sparse.linalg._dsolve._superlu, scipy.sparse.linalg._eigen.arpack._arpack, scipy.sparse.linalg._propack._spropack, scipy.sparse.linalg._propack._dpropack, scipy.sparse.linalg._propack._cpropack, scipy.sparse.linalg._propack._zpropack, scipy.sparse.csgraph._tools, scipy.sparse.csgraph._shortest_path, scipy.sparse.csgraph._traversal, scipy.sparse.csgraph._min_spanning_tree, scipy.sparse.csgraph._flow, scipy.sparse.csgraph._matching, scipy.sparse.csgraph._reordering, ujson (total: 101)
---------------------------------------------------------------------------
Last messages on stdout:
NOTE: When using the `ipython kernel` entry point, Ctrl-C will not work.
To exit, you will have to explicitly quit this process, by either sending
"quit" from a client, or using Ctrl-\ in UNIX-like environments.
To read more about this, see https://github.com/ipython/ipython/issues/2049
To connect another client to this kernel, use:
--existing /databricks/kernel-connections/3548c4a186efe1f254dfff7fc25643a60e70bb786cda5765bc2aa4268bfa8b53.json
Collecting opencv-python
Downloading opencv_python-4.13.0.92-cp37-abi3-manylinux_2_28_x86_64.whl.metadata (19 kB)
Requirement already satisfied: numpy>=2 in /databricks/python3/lib/python3.12/site-packages (from opencv-python) (2.1.3)
Downloading opencv_python-4.13.0.92-cp37-abi3-manylinux_2_28_x86_64.whl (72.9 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 0.0/72.9 MB ? eta -:--:--
━━━━━━━━━━━━━━━━╸━━━━━━━━━━━━━━━━━━━━━━━ 30.4/72.9 MB 164.4 MB/s eta 0:00:01
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╸━━━━━ 62.9/72.9 MB 161.1 MB/s eta 0:00:01
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╸ 72.9/72.9 MB 161.6 MB/s eta 0:00:01
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 72.9/72.9 MB 43.3 MB/s eta 0:00:00
Installing collected packages: opencv-python
Successfully installed opencv-python-4.13.0.92
Note: you may need to restart the kernel using %restart_python or dbutils.library.restartPython() to use updated packages.
Strangely enough, this setup was working fine a few weeks ago.
If anyone has the same issue or knows a workaround, I would be delighted to learn about it.
Best,
Boudewijn
4 weeks ago
Hi @Bodevan,
This is a common scenario when installing the standard opencv-python package on Databricks (or any headless server environment). The root cause is that opencv-python ships with GUI dependencies (Qt and X11 libraries) that are not available on Databricks cluster nodes, since they are headless Linux servers with no display. When Python tries to load the cv2 module, it attempts to link against those missing shared libraries, which causes the process to abort with SIGABRT (exit code 134) -- exactly the crash you are seeing.
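To see for yourself whether those display libraries are present on a node (a diagnostic sketch; the output depends entirely on the machine, so no particular result is guaranteed), you can probe for them with the standard library before importing cv2:

```python
import ctypes.util

# opencv-python's native extension expects GUI/display shared libraries
# at load time; if the dynamic loader cannot resolve them, the import
# aborts the whole process instead of raising a catchable Python error.
for lib in ("GL", "X11"):
    path = ctypes.util.find_library(lib)
    print(f"lib{lib}:", path or "not found")
```

Running this in a notebook cell before the import is safe, because find_library only searches for the library; it does not load cv2's extension module.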
You mentioned this was working a few weeks ago. The likely explanation is the opencv-python 4.13.0.92 release (which your logs show being downloaded): it may link against GUI libraries differently than the version you had installed before. Either way, the fix below will resolve it reliably going forward.
THE FIX
Use opencv-python-headless instead of opencv-python. This package provides the exact same OpenCV functionality (all the same cv2 functions for image reading, processing, transformations, model inference, etc.) but without the GUI components that cause the crash. You do not need cv2.imshow() or other GUI display functions on a Databricks cluster anyway, since there is no display server.
In your notebook, run the following:
%pip install opencv-python-headless
Then in the next cell:
import cv2
print(cv2.__version__)
That should work without any crash.
IMPORTANT: UNINSTALL THE CONFLICTING PACKAGE FIRST
If you already ran %pip install opencv-python in your current session, the conflicting package may still be cached. To be safe, uninstall it first:
%pip uninstall opencv-python -y
%pip install opencv-python-headless
Then restart the Python environment to clear any cached module state:
dbutils.library.restartPython()
After that, import cv2 should work cleanly.
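If you want to confirm that a cv2 module is visible at all before actually importing it (useful here, because a bad wheel kills the kernel at import time, not at install time), find_spec locates the module without executing its import-time code. This is a sketch; the printed path will differ per cluster:

```python
import importlib.util

# find_spec resolves the module on sys.path without running the native
# extension's load code, so it cannot crash the kernel the way a real
# "import cv2" can.
spec = importlib.util.find_spec("cv2")
print("cv2 found at:", spec.origin if spec else None)
```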
IF YOU NEED OPENCV CONTRIB MODULES
If your workload requires the extra OpenCV contrib modules (such as SURF or other extended algorithms; note that SIFT has shipped in the main modules since OpenCV 4.4), use the headless contrib variant instead:
%pip install opencv-contrib-python-headless
This gives you the full contrib module set without the GUI dependencies.
MAKING IT PERSISTENT
If you want OpenCV available on every cluster start without re-running %pip each time, you have a few options:
1. Cluster library: Go to your cluster configuration, click Libraries, and install opencv-python-headless as a PyPI package. This installs it for all notebooks on that cluster.
2. Init script: Create a cluster-scoped init script that runs pip install opencv-python-headless at node startup. This is useful if you need it on all nodes for distributed UDF workloads.
3. Requirements file: If you use a requirements.txt for your project, add opencv-python-headless there and install via %pip install -r requirements.txt.
WHY THIS HAPPENS
The four OpenCV PyPI packages are:
- opencv-python (main modules + GUI) -- crashes on headless servers
- opencv-python-headless (main modules, no GUI) -- works on Databricks
- opencv-contrib-python (main + contrib + GUI) -- crashes on headless servers
- opencv-contrib-python-headless (main + contrib, no GUI) -- works on Databricks
These packages all provide the same "cv2" module name and conflict with each other, so only one should be installed at a time. The headless variants are designed specifically for server and cloud environments like Databricks.
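A quick way to check which of the four distributions (if any) are installed in the current environment is to query package metadata with the standard library; more than one entry in the result means you have conflicting cv2 wheels:

```python
from importlib import metadata

variants = [
    "opencv-python",
    "opencv-python-headless",
    "opencv-contrib-python",
    "opencv-contrib-python-headless",
]

# Collect (name, version) for every OpenCV distribution actually
# installed; PackageNotFoundError just means that variant is absent.
installed = []
for name in variants:
    try:
        installed.append((name, metadata.version(name)))
    except metadata.PackageNotFoundError:
        pass
print(installed)
```

If the list has more than one entry, uninstall everything and reinstall only the headless variant you need.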
REFERENCES
- Databricks documentation on notebook-scoped Python libraries: https://docs.databricks.com/en/libraries/notebooks-python-libraries.html
- Databricks documentation on cluster libraries: https://docs.databricks.com/en/libraries/cluster-libraries.html
- OpenCV Python package documentation explaining headless variants: https://pypi.org/project/opencv-python-headless/
- Databricks Runtime 17.3 LTS ML release notes (your runtime): https://docs.databricks.com/en/release-notes/runtime/17.3lts-ml.html
Hope this gets you unblocked. Let us know if you run into anything else.
* This reply used an agent system I built to research and draft this response based on the wide set of documentation I have available and previous memory. I personally review the draft for any obvious issues and for monitoring system reliability and update it when I detect any drift, but there is still a small chance that something is inaccurate, especially if you are experimenting with brand new features.