
Library installation in cluster taking a long time

AyushModi038
New Contributor III

I am trying to install the "pycaret" library on a cluster using a whl file.

But it sometimes creates a dependency conflict (not always; sometimes it works).

My questions are -

1 - How can I install libraries on the cluster only once (maybe from a cache)? They are downloaded and installed every time I start the cluster.

It takes around 20 minutes to install them.

2 - How can I solve the dependency error, and why does it not happen every time?

This might be due to a change in the NumPy version: the default runtime has 1.21.5, and after the library installation it (sometimes) changes to 1.19.5.

The error that I get is:

"ValueError: numpy.ndarray size changed, may indicate binary incompatibility. Expected 88 from C header, got 80 from PyObject"

Another issue that follows once the above is (surprisingly) resolved is

'ImportError: Numba needs NumPy 1.20 or less', which also gets resolved after I re-run the cell.
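A quick way to confirm the version flip is to print the loaded NumPy version before and after the install; a minimal check:

    import numpy
    print(numpy.__version__)  # compare this value before and after the pycaret install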

Can someone please help??

8 REPLIES

jose_gonzalez
Moderator

Hi,

Which DBR version are you using? Are you installing the library using an init script, or after the cluster is up and running? Do you see any error message while trying to install the library? Check the driver logs.
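For reference, installing via an init script means running a shell script at cluster start; a minimal sketch, with a hypothetical wheel path on DBFS:

    #!/bin/bash
    # Minimal cluster-scoped init script: install pycaret from a pre-staged
    # wheel on DBFS (the path is a placeholder, not a real location)
    /databricks/python/bin/pip install /dbfs/FileStore/wheels/pycaret-2.3.10-py3-none-any.whl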

AyushModi038
New Contributor III

Hi Jose,

Thanks for the help.

Here are the requested details -

DBR Version - 12.1 ML (includes Apache Spark 3.3.1, Scala 2.12)

Installation Mode - Using the cluster UI page -> Libraries tab -> Install New. It installs the library every time the cluster starts.

Error Messages in Driver Logs -

ValueError: numpy.ndarray size changed, may indicate binary incompatibility. Expected 88 from C header, got 80 from PyObject

It is due to a NumPy version mismatch.
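A common workaround for this kind of binary (ABI) mismatch is to install the library together with an explicit NumPy pin in one command, then restart the Python process so already-imported C extensions reload against a single NumPy version; a minimal notebook sketch (the pinned versions are assumptions, match them to your environment):

    %pip install "pycaret==2.3.10" "numpy==1.19.5"

    # then, in a separate cell, restart Python so compiled extensions
    # reload against the pinned NumPy:
    dbutils.library.restartPython()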

efry
New Contributor II

I'm having the exact same problem, and it's causing issues when I run workflows, too. Please advise, Databricks.

Anonymous
Not applicable

Hi @Ayush Modi

Thank you for posting your question in our community! We are happy to assist you.

To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question?

This will also help other community members who may have similar questions in the future.

Thank you for your participation and let us know if you need any further assistance! 

Spencer_Kent
New Contributor III

Can any Databricks pros provide some guidance on this? My clusters that have "cluster-installed" libraries take 30 minutes or more to become usable. I'm only trying to install a handful of CRAN libraries, but having to re-install them every time a cluster starts up is SO painful. 

shirlyb-melio
New Contributor II

I am experiencing a similar issue where a few libraries take 15 minutes to install when running a workflow. Could you please advise if there is a solution for this?

Kaniz_Fatma

Hi @AyushModi038,

  • Ensure that the library you’re installing is compatible with the NumPy version on your cluster.
  • Check whether any other libraries are causing conflicts.
  • Consider specifying the NumPy version explicitly in your requirements file (see the sketch after this list).
  • If the issue persists, try a different cluster configuration or runtime version.
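For illustration, a requirements file that pins NumPy together with the packages compiled against it might look like this; the versions below are assumptions, not tested pins:

    # requirements.txt - resolve NumPy and its binary-dependent packages as one set
    numpy==1.19.5     # assumption: a version acceptable to both pycaret and numba
    numba==0.52.0     # assumption: a numba release built for NumPy < 1.20
    pycaret==2.3.10   # assumption: the pycaret release being installed

Installing them in a single command (for example, %pip install -r /dbfs/FileStore/requirements.txt) lets pip resolve the set together instead of downgrading NumPy partway through the install.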

Regarding the subsequent ‘ImportError: Numba needs NumPy 1.20 or less’, it’s likely related to the same NumPy version discrepancy. Make sure all dependencies align with the desired NumPy version.

If you need further assistance, feel free to ask! 😊🚀

Spencer_Kent
New Contributor III

@Kaniz_Fatma What about question #1, which is what subsequent comments to this thread have been referring to? To recap the question: is it possible for "cluster-installed" libraries to be cached in such a way that they aren't completely reinstalled every time the cluster is started?
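For what it's worth, one pattern that can avoid the re-download on every start (a sketch only; the DBFS paths are assumptions and this is not an official caching mechanism) is to stage the packages on DBFS once, then have a cluster-scoped init script install from that local copy:

    %sh
    # run once from a notebook: download pycaret and its dependencies to DBFS
    /databricks/python/bin/pip download pycaret -d /dbfs/FileStore/wheel_cache/

    #!/bin/bash
    # cluster-scoped init script: install only from the pre-staged copy,
    # skipping the network download at every cluster start
    /databricks/python/bin/pip install --no-index --find-links=/dbfs/FileStore/wheel_cache/ pycaret

pip still runs at startup, but resolving from a local directory is usually much faster than downloading; packages fetched as source distributions would still be built each time, so pre-building wheels helps.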
