02-17-2023 06:26 AM
I am trying to install the "pycaret" library on a cluster using a whl file.
But it sometimes creates a dependency conflict (not always; sometimes it works fine).
My questions are -
1 - How can I install libraries on the cluster only once (maybe from a cache)? Right now they are downloaded and installed every time I start the cluster.
It takes around 20 minutes to install this.
2 - How do I solve the dependency error, and why is it not reproduced every time?
This might be due to a change in the numpy version, because the default runtime has 1.21.5 and after the library installation it sometimes changes to 1.19.5.
And the error that I get is
"ValueError: numpy.ndarray size changed, may indicate binary incompatibility. Expected 88 from C header, got 80 from PyObject"
Another issue that follows once the above gets resolved (surprisingly) is
'ImportError: Numba needs NumPy 1.20 or less', which also gets resolved after I re-run the cell.
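For reference, a minimal check in a notebook cell shows which NumPy version the session actually picked up (nothing pycaret-specific, just a plain import):

import numpy
print(numpy.__version__)  # the default runtime reports 1.21.5; after the pycaret install it sometimes drops to 1.19.5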
Can someone please help??
02-22-2023 02:02 PM
Hi,
Which DBR version are you using? Are you installing the library with an init script, or do you install it once the cluster is up and running? Do you see any error messages while trying to install the library? Check the driver logs.
02-27-2023 01:53 AM
Hi Jose,
Thanks for the help.
Here are the requested details -
DBR Version - 12.1 ML (includes Apache Spark 3.3.1, Scala 2.12)
Installation Mode - Using the cluster UI page -> Libraries tab -> Install New. It installs the library every time the cluster starts.
Error Messages in Driver Logs -
ValueError: numpy.ndarray size changed, may indicate binary incompatibility. Expected 88 from C header, got 80 from PyObject
It is due to a numpy version mismatch.
10-04-2023 09:59 AM
I'm having the exact same problem, and it's causing issues when I run workflows, too. Please advise, Databricks.
03-10-2023 07:16 PM
Hi @Ayush Modi
Thank you for posting your question in our community! We are happy to assist you.
To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question?
This will also help other community members who may have similar questions in the future.
Thank you for your participation and let us know if you need any further assistance!
06-04-2024 10:27 AM
Can any Databricks pros provide some guidance on this? My clusters that have "cluster-installed" libraries take 30 minutes or more to become usable. I'm only trying to install a handful of CRAN libraries, but having to re-install them every time a cluster starts up is SO painful.
06-26-2024 07:03 AM
I am experiencing a similar issue where a few libraries take 15 minutes to install when running a workflow. Could you please advise if there is a solution for this?
07-05-2024 01:08 AM
Hi @AyushModi038 ,
Regarding the subsequent 'ImportError: Numba needs NumPy 1.20 or less', it's likely related to the same NumPy version discrepancy. Make sure all dependencies align with the desired NumPy version.
If you need further assistance, feel free to ask! 😊🚀
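As a minimal sketch of what aligning the versions could look like in practice (the version numbers below are illustrative -- check the requirements of the pycaret release you target), a notebook-scoped install that pins NumPy next to pycaret keeps the resolver from silently downgrading it:

%pip install "pycaret==3.0.0" "numpy==1.21.5"
dbutils.library.restartPython()  # restart the Python process so the freshly installed NumPy is the one imported

The cluster-scoped equivalent is to add the pinned numpy requirement as a PyPI entry next to the pycaret whl in the cluster's Libraries tab.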
07-05-2024 10:22 AM
@Kaniz_Fatma What about question #1, which is what the subsequent comments in this thread have been referring to? To recap the question: is it possible for "cluster-installed" libraries to be cached in such a way that they aren't completely reinstalled every time the cluster is started?
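For what it's worth, one workaround pattern that seems possible (a sketch, not an official caching feature) is to keep a pre-downloaded "wheelhouse" on DBFS and point pip at it, so startup only installs local wheel files instead of resolving and downloading everything from PyPI. The /dbfs/FileStore/wheelhouse path below is just an example:

# One-time step (e.g. from a %sh cell): pip download pycaret -d /dbfs/FileStore/wheelhouse
# At run time, install from the local wheelhouse only:
%pip install --no-index --find-links=/dbfs/FileStore/wheelhouse pycaret

This doesn't remove the install step itself, but it cuts out the network download and dependency resolution; for a truly pre-baked environment, a custom image via Databricks Container Services (with the libraries already installed) would be the other common route.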