
numpy.ndarray size changed, may indicate binary incompatibility

esi
New Contributor

Hi All,

I have installed the following libraries on my cluster (11.3 LTS, which includes Apache Spark 3.3.0 and Scala 2.12):

numpy==1.21.4
flair==0.12
 
On executing `from flair.models import TextClassifier`, I get the following error:
"numpy.ndarray size changed, may indicate binary incompatibility. Expected 96 from C header, got 88 from PyObject"

I have tried many things, including updating numpy, downgrading numpy, and changing the cluster specification, but all have failed.

Any help would be greatly appreciated.
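
A quick way to confirm which numpy the notebook actually imports (a minimal diagnostic sketch; the printed version and path depend on the cluster configuration):

```python
# Check which numpy build the driver actually resolves at import time.
import numpy

print(numpy.__version__)  # version that gets imported (runtime default or cluster-installed pin)
print(numpy.__file__)     # install location, which shows whether the pinned install took effect
```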
2 REPLIES

sean_owen
Databricks Employee

You have changed the numpy version, and presumably that is not compatible with other libraries in the runtime. If flair requires a later numpy, use a later DBR for best results; newer runtimes already ship later numpy versions.
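
As a minimal notebook-cell sketch of that suggestion (the runtime version here is an assumption; the point is to let flair resolve against the numpy the runtime already ships rather than pinning it):

```python
# On a newer DBR (e.g. 13.x or later), install flair without pinning numpy,
# so it resolves against the numpy version the runtime's compiled packages were built for.
%pip install flair==0.12

# Restart the Python process so the freshly installed packages are picked up.
dbutils.library.restartPython()
```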

Eddy123
New Contributor II