I'm encountering an issue installing Python packages from a private PyPI mirror when the package has dependencies and the installation runs as a cluster library (Cluster libraries | Databricks on AWS). Initially everything worked smoothly: packages without dependencies were installed and executed as expected. However, once a more complex version of my package was deployed to Artifactory, with dependencies declared in the install_requires parameter of its setup.py, the installation started failing. The dependencies from public PyPI are not resolved, resulting in errors like the following:
ERROR: Could not find a version that satisfies the requirement package_x==1.2.3 (from versions: none).
It seems that the cluster installation process passes the mirror URL via --index-url instead of --extra-index-url, which would make pip search only the mirror and ignore the default PyPI index. Interestingly, in a notebook context (Notebook-scoped Python libraries | Databricks on AWS), installing the same package with --extra-index-url proceeds without any issues.
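The behavioral difference between the two options can be sketched as pip configuration; the Artifactory URL below is a hypothetical placeholder:

```ini
# Hypothetical pip.conf illustrating the suspected difference.

# index-url REPLACES the default index: pip searches ONLY the mirror,
# so a public dependency like package_x==1.2.3 cannot be found.
[global]
index-url = https://artifactory.example.com/api/pypi/my-repo/simple

# By contrast, extra-index-url ADDS the mirror alongside the default
# PyPI index, so both the private package and its public
# dependencies resolve:
# [global]
# extra-index-url = https://artifactory.example.com/api/pypi/my-repo/simple
```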
This inconsistency between cluster libraries and notebook-scoped libraries is proving quite challenging, particularly as projects grow more complex and reliant on external dependencies.
I'm reaching out to the community for any insights or assistance in resolving this matter. If anyone has encountered a similar issue or has suggestions for potential workarounds, I would greatly appreciate your input.