numpy.ndarray size changed, may indicate binary incompatibility

esi
New Contributor II

Hi All,

I have installed the following libraries on my cluster (11.3 LTS that includes Apache Spark 3.3.0, Scala 2.12):

numpy==1.21.4
flair==0.12
 
On executing `from flair.models import TextClassifier`, I get the following error:
"numpy.ndarray size changed, may indicate binary incompatibility. Expected 96 from C header, got 88 from PyObject"

I have tried many things, including upgrading numpy, downgrading numpy, and changing the cluster specifications, but all have failed.

Any help would be greatly appreciated.

sean_owen
Databricks Employee

You have changed the numpy version, and presumably that version is not compatible with other libraries in the runtime. This error typically means a package was compiled against a different numpy than the one now installed. If flair requires a later numpy, use a later DBR runtime for best results, since it already ships with a newer numpy version.
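One way to sanity-check this before pinning anything is to compare the runtime's installed numpy against the minimum your package needs. A minimal sketch (the `(1, 22)` minimum here is purely illustrative; check flair's own requirements for the real bound):

```python
import numpy

# Hypothetical minimum version for illustration only; consult flair's
# requirements for the actual lower bound it was built against.
required = (1, 22)

# Parse "major.minor" from the installed numpy version string.
installed = tuple(int(p) for p in numpy.__version__.split(".")[:2])

if installed >= required:
    print("Runtime numpy is recent enough; avoid downgrading it.")
else:
    print("Runtime numpy is older than required; prefer a newer DBR runtime.")
```

If the runtime's numpy already satisfies the requirement, installing only `flair` (without pinning numpy yourself) avoids the binary-incompatibility mismatch.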