Hi All,
I have installed the following libraries on my cluster (11.3 LTS, which includes Apache Spark 3.3.0 and Scala 2.12):
On executing `from flair.models import TextClassifier`, I get the following error:
"numpy.ndarray size changed, may indicate binary incompatibility. Expected 96 from C header, got 88 from PyObject"
I have tried many things, including updating numpy, downgrading numpy, and changing the cluster specifications, but all have failed.
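For reference, my attempts in the notebook looked roughly like this (the exact numpy versions below are placeholders, not necessarily the ones I tried):

```python
# Attempt 1: upgrade numpy, then force-reinstall flair so it is built/resolved against it
%pip install --upgrade numpy
%pip install --force-reinstall flair

# Attempt 2: pin numpy to an older release (version shown is just an example)
%pip install numpy==1.21.6

# Restart the Python process so the notebook picks up the new numpy
dbutils.library.restartPython()
```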
Any help would be greatly appreciated.