Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Library installation in cluster taking a long time

AyushModi038
New Contributor III

I am trying to install the "pycaret" library on a cluster using a whl file.

But it sometimes creates a dependency conflict (not always; sometimes it works too).

My questions are -

1 - How can I install libraries on the cluster only once (maybe from a cache)? Currently they are downloaded and installed every time I start the cluster.

It takes around 20 minutes to install this.

2 - How can I solve the dependency error, and why is it not reproduced every time?

This might be due to a change in the numpy version: the default runtime has 1.21.5, and after the library installation it sometimes changes to 1.19.5.

And the error that I get is

"ValueError: numpy.ndarray size changed, may indicate binary incompatibility. Expected 88 from C header, got 80 from PyObject"

Another issue that follows once the above is resolved (surprisingly) is

'ImportError: Numba needs NumPy 1.20 or less', which also gets resolved after I re-run the cell.

Can someone please help??
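One pattern that may help with question 1 (a sketch, not an official recipe): download the wheels to DBFS once, then point a cluster init script at that cache so startup installs never hit PyPI. The `/dbfs/FileStore/wheels` path here is an assumption for illustration.

```shell
#!/bin/bash
# Hypothetical cluster init script. Assumes the wheels for pycaret and its
# dependencies were downloaded once (e.g. `pip download pycaret -d ...`)
# and copied to this DBFS folder beforehand.
CACHE=/dbfs/FileStore/wheels

# --no-index forbids contacting PyPI; --find-links resolves everything
# from the local wheel cache, which is faster and reproducible.
pip install --no-index --find-links "$CACHE" pycaret
```

The cache can be seeded once from any notebook, e.g. `%sh pip download pycaret -d /dbfs/FileStore/wheels`, after which cluster starts only copy and install local files.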

5 REPLIES

jose_gonzalez
Moderator

Hi,

Which DBR version are you using? Are you installing the library with an init script, or do you install it once the cluster is up and running? Do you see any error messages while trying to install the library? Check the driver logs.

Hi Jose,

Thanks for the help.

Here are the requested details -

DBR Version - 12.1 ML (includes Apache Spark 3.3.1, Scala 2.12)

Installation Mode - Using the cluster UI page -> Libraries tab -> Install New. It installs the library every time the cluster starts.

Error Messages in Driver Logs -

ValueError: numpy.ndarray size changed, may indicate binary incompatibility. Expected 88 from C header, got 80 from PyObject

It is due to a numpy version mismatch.
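If the downgrade to numpy 1.19.5 is what breaks the binary interface, one sketch is to pin numpy explicitly at install time so pip surfaces any genuine conflict instead of silently downgrading. This assumes the pycaret release being installed actually supports numpy 1.21.x; if its own pins conflict, pip will fail loudly here, which is still more debuggable than the ndarray-size error.

```shell
# Hypothetical: install with numpy pinned to the DBR 12.1 ML default,
# so the resolver cannot swap in an ABI-incompatible 1.19.x build
# underneath compiled extensions.
pip install "numpy==1.21.5" pycaret

# Verify nothing in the environment is left with broken requirements.
pip check
```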

efry
New Contributor II

I'm having the exact same problem, and it's causing issues when I run workflows too. Please advise, Databricks.

Anonymous
Not applicable

Hi @Ayush Modi

Thank you for posting your question in our community! We are happy to assist you.

To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question?

This will also help other community members who may have similar questions in the future.

Thank you for your participation and let us know if you need any further assistance! 

Spencer_Kent
New Contributor III

Can any Databricks pros provide some guidance on this? My clusters that have "cluster-installed" libraries take 30 minutes or more to become usable. I'm only trying to install a handful of CRAN libraries, but having to re-install them every time a cluster starts up is SO painful. 
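For the CRAN case, a comparable sketch (paths and package names are placeholders) is an init script that installs from a CRAN-style snapshot mirrored to DBFS once, so cluster startup never re-downloads packages from CRAN:

```shell
#!/bin/bash
# Hypothetical init script: install R packages from a miniCRAN-style
# repository stored on DBFS instead of fetching from CRAN at each start.
Rscript -e 'install.packages(c("dplyr", "data.table"), repos = "file:///dbfs/FileStore/miniCRAN")'
```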
