
Python wheel cannot be installed as a library.

vk217
Contributor

When I try to install the Python .whl as a library, I get the error below. However, I can install it as a JAR and it works fine. One difference is that I am creating my own cluster by cloning an existing cluster and copying the .whl to a folder called testing in DBFS.

Is it mandatory for the .whl to be in the dbfs:/FileStore/jars/ folder?

Library installation attempted on the driver node of cluster 0930-154215-fcm3m6pc and failed. Please refer to the following error message to fix the library or contact Databricks support. Error Code: DRIVER_LIBRARY_INSTALLATION_FAILURE. Error Message: org.apache.spark.SparkException: Process List(/databricks/python/bin/pip, install, --upgrade, --find-links=/local_disk0/spark-1032c562-aec3-4228-b221-6a5b507e6b65/userFiles-5ea00ae0-697e-4804-98e4-987fd932aebe, /local_disk0/spark-1032c562-aec3-4228-b221-6a5b507e6b65/userFiles-5ea00ae0-697e-4804-98e4-987fd932aebe/shared_functions-1.0.0.0-py3-none-any.whl, --disable-pip-version-check) exited with code 1.

ERROR: shared-functions has an invalid wheel, .dist-info directory 'shared_functions-1.0.0.0.dist-info' does not start with 'shared-functions'
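
For anyone debugging the same error: pip derives the expected project name from the wheel's file name and compares it with the .dist-info directory stored inside the archive. Below is only a diagnostic sketch (the path assumes the wheel sits under /dbfs/testing/ as described above) to compare the two:

import os
import zipfile

# Illustrative path; DBFS is mounted at /dbfs on the driver node.
whl_path = "/dbfs/testing/shared_functions-1.0.0.0-py3-none-any.whl"

# The name and version pip expects come from the file name:
# <distribution>-<version>(-<build>)?-<python>-<abi>-<platform>.whl
name, version = os.path.basename(whl_path).split("-")[:2]
print("from file name:", name, version)

# The .dist-info directory actually stored inside the wheel.
with zipfile.ZipFile(whl_path) as whl:
    dist_info = sorted({n.split("/")[0] for n in whl.namelist() if ".dist-info/" in n})
print("inside the wheel:", dist_info)

# If these two don't line up, pip can reject the wheel with the
# "has an invalid wheel" error shown above.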


1 ACCEPTED SOLUTION


vk217
Contributor

The issue was that the wheel file had been renamed after it was built, so when it was installed on the cluster it was not recognized.
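
If it helps anyone hitting the same thing: the .whl file name has to match the metadata baked in at build time, so the fix is to restore the original file name (or rebuild the wheel) rather than renaming the file by hand. A rough sketch, where the paths and the renamed file name are made up for illustration and a pure-Python wheel is assumed:

import shutil
import zipfile

# Hypothetical renamed wheel; adjust to your own DBFS layout.
renamed = "/dbfs/testing/my_renamed_copy-py3-none-any.whl"

# Recover the original <name>-<version> from the .dist-info directory inside the wheel...
with zipfile.ZipFile(renamed) as whl:
    dist_info = next(n.split("/")[0] for n in whl.namelist() if ".dist-info/" in n)
name_version = dist_info[: -len(".dist-info")]

# ...and copy the file back to a name pip will accept.
fixed = f"/dbfs/testing/{name_version}-py3-none-any.whl"
shutil.copy(renamed, fixed)
print("install", fixed, "instead of the renamed file")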


5 REPLIES

Sivaprasad1
Valued Contributor II

@Vikas B: Which DBR version are you using?

Could you please run the following from a notebook?

%sh
ls -l /dbfs/FileStore/shared_functions-1.0.0.0-py3-none-any.whl

vk217
Contributor

@Sivaprasad C S This is what I get when I run that command.

-rwxrwxrwx 1 root root 20951 Oct 12 16:51 /dbfs/FileStore/shared_functions-1.0.0.0-py3-none-any.whl

Hubert-Dudek
Esteemed Contributor III

I think you have - (dash) in some places and _ (underscore) in others. Please unify them.
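
To expand on that: pip normalizes project names so that runs of '-', '_' and '.' are treated as the same separator (PEP 503), which is why shared_functions and shared-functions refer to the same project even though the strings differ. Roughly:

import re

def normalize(name: str) -> str:
    # PEP 503 name normalization: collapse runs of '-', '_', '.' into a single '-'.
    return re.sub(r"[-_.]+", "-", name).lower()

print(normalize("shared_functions"))  # shared-functions
print(normalize("shared-functions"))  # shared-functions

The check in the error above still compares the file name against the .dist-info prefix inside the wheel, so the safest route is to keep the file name exactly as the build produced it.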

Anonymous
Not applicable

Hi @Vikas B,

Hope all is well!

Just wanted to check in to see whether you were able to resolve your issue. If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help.

We'd love to hear from you.

Thanks!
