numpy.ndarray size changed, may indicate binary incompatibility
07-04-2023 08:23 AM
Hi All,
I have installed the following libraries on my cluster (11.3 LTS that includes Apache Spark 3.3.0, Scala 2.12):
numpy==1.21.4
flair==0.12
on executing `from flair.models import TextClassifier`, I get the following error:
"numpy.ndarray size changed, may indicate binary incompatibility. Expected 96 from C header, got 88 from PyObject"
I have tried many things, including upgrading numpy, downgrading numpy, and changing the cluster specifications, but all have failed.
Any help would be greatly appreciated.
"numpy.ndarray size changed, may indicate binary incompatibility. Expected 96 from C header, got 88 from PyObject"
I have tried many things including updating numpy, downgrading numpy, changing the cluster specifications but all have failed.
Any help would be greatly appreciated.
08-04-2023 08:17 AM
You have changed the numpy version, and presumably it is not compatible with other libraries in the runtime. If flair requires a later numpy, use a later DBR runtime, which already ships with newer numpy versions.
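This error typically means a compiled extension in one of the installed wheels was built against a different numpy C ABI than the numpy actually importable at runtime. A minimal sketch of a sanity check you can run on the cluster before importing flair (the `1.22` threshold is purely illustrative, not a flair requirement):

```python
import numpy as np

def numpy_at_least(version, required=(1, 22)):
    """Return True if a numpy version string meets a minimum major.minor."""
    installed = tuple(int(p) for p in version.split(".")[:2])
    return installed >= required

# Compare the runtime's resolved numpy against the illustrative minimum.
print(np.__version__, numpy_at_least(np.__version__))
```

If the version printed here differs from the one you pip-installed, the cluster is resolving a different numpy (for example, the one baked into the DBR runtime), which is the usual cause of this mismatch.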
06-20-2024 08:51 PM
We also get a similar error with numpy 2.0.