10-12-2022 01:08 PM
When I try to install a Python .whl library, I get the error below. However, I can install it as a JAR and it works fine. One difference is that I am creating my own cluster by cloning an existing cluster and copying the .whl to a folder called testing in DBFS.
Is it mandatory for the .whl to be in the dbfs:/FileStore/jars/ folder?
Library installation attempted on the driver node of cluster 0930-154215-fcm3m6pc and failed. Please refer to the following error message to fix the library or contact Databricks support. Error Code: DRIVER_LIBRARY_INSTALLATION_FAILURE. Error Message: org.apache.spark.SparkException: Process List(/databricks/python/bin/pip, install, --upgrade, --find-links=/local_disk0/spark-1032c562-aec3-4228-b221-6a5b507e6b65/userFiles-5ea00ae0-697e-4804-98e4-987fd932aebe, /local_disk0/spark-1032c562-aec3-4228-b221-6a5b507e6b65/userFiles-5ea00ae0-697e-4804-98e4-987fd932aebe/shared_functions-1.0.0.0-py3-none-any.whl, --disable-pip-version-check) exited with code 1.
ERROR: shared-functions has an invalid wheel, .dist-info directory 'shared_functions-1.0.0.0.dist-info' does not start with 'shared-functions'
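For context on what pip is complaining about: at install time it checks that the distribution name derived from the wheel's filename matches the name in the wheel's embedded .dist-info directory, comparing the two after PEP 503 normalization (runs of '-', '_' and '.' collapse to a single '-'). A minimal sketch of that check, assuming standard wheel naming; the helper names here are illustrative, not pip's internals:

```python
import re

def canonicalize(name: str) -> str:
    # PEP 503 normalization: runs of '-', '_' and '.' become a single '-'
    return re.sub(r"[-_.]+", "-", name).lower()

def wheel_matches_dist_info(wheel_filename: str, dist_info_dir: str) -> bool:
    # Wheel filenames look like: <name>-<version>-<pytag>-<abi>-<platform>.whl
    file_name = wheel_filename.split("-")[0]
    # dist-info directories look like: <name>-<version>.dist-info
    info_name = dist_info_dir[: -len(".dist-info")].rsplit("-", 1)[0]
    return canonicalize(file_name) == canonicalize(info_name)

# The pair from the error above is consistent once normalized ...
print(wheel_matches_dist_info("shared_functions-1.0.0.0-py3-none-any.whl",
                              "shared_functions-1.0.0.0.dist-info"))  # True
# ... but a wheel file renamed after the build is not:
print(wheel_matches_dist_info("renamed_functions-1.0.0.0-py3-none-any.whl",
                              "shared_functions-1.0.0.0.dist-info"))  # False
```

So a mismatch like the one in the error usually means the filename and the metadata inside the wheel disagree about the distribution's name.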
10-13-2022 05:51 AM
@Vikas B: Which DBR version are you using?
Could you please run
%sh
ls -l /dbfs/FileStore/shared_functions-1.0.0.0-py3-none-any.whl
from a notebook?
10-13-2022 08:02 AM
@Sivaprasad C S This is what I get when I run that command.
-rwxrwxrwx 1 root root 20951 Oct 12 16:51 /dbfs/FileStore/shared_functions-1.0.0.0-py3-none-any.whl
10-16-2022 12:02 PM
I think you have - (dash) in some places and _ (underscore) in others. Please unify them.
11-19-2022 10:49 PM
Hi @Vikas B
Hope all is well!
Just wanted to check in to see whether you were able to resolve your issue. If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help.
We'd love to hear from you.
Thanks!
12-19-2022 05:52 AM
The issue was that the package was renamed after it was installed on the cluster, so it was not recognized.
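This resolution matches the error message: renaming a wheel file does not change the name baked into its .dist-info directory, which pip validates against the filename at install time. A small sketch of why the rename breaks things; the wheel here is a synthetic metadata-only stand-in built in a temp directory, and the "my_functions" name is hypothetical:

```python
import os
import tempfile
import zipfile

# Build a minimal stand-in wheel for "shared_functions" (metadata only).
tmp = tempfile.mkdtemp()
wheel_path = os.path.join(tmp, "shared_functions-1.0.0.0-py3-none-any.whl")
with zipfile.ZipFile(wheel_path, "w") as zf:
    zf.writestr("shared_functions-1.0.0.0.dist-info/METADATA",
                "Metadata-Version: 2.1\nName: shared_functions\nVersion: 1.0.0.0\n")

# "Rename" the package by renaming the file, as happened in this thread.
renamed = os.path.join(tmp, "my_functions-1.0.0.0-py3-none-any.whl")
os.rename(wheel_path, renamed)

# The embedded .dist-info directory still carries the old name, so pip
# rejects the wheel; rebuilding it under the new name is the real fix.
with zipfile.ZipFile(renamed) as zf:
    dist_info = sorted({n.split("/")[0] for n in zf.namelist()})[0]
print(dist_info)  # shared_functions-1.0.0.0.dist-info
```

In other words, if the package needs a new name, rebuild the wheel from source under that name rather than renaming the built .whl file.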