The error indicates a permissions issue when trying to install your custom .whl file from the "Shared" folder in Databricks using dbx and Spark. Specifically, the error [Errno 1] Operation not permitted suggests the process does not have the required permissions to access or install the wheel from that directory.
Core Solution
The root cause is that Databricks clusters run with restricted filesystem permissions, and your workflow process may not have access to /Workspace/Shared/Libraries/. In addition, Poetry's {file = "..."} dependency syntax is meant for local development; Databricks expects custom libraries to be installed through its own mechanisms.
What You Should Do
1. Library Installation Location
- Copy or move your wheel file to a cluster-local directory, such as /local_disk0/tmp/, or to one of Databricks' supported library install locations.
- Reference that new path during installation in your workflow.
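As a sketch of step 1, staging the wheel to local disk and then installing it from there might look like the following. The function names are my own, and the paths in the commented-out call are illustrative; on a real cluster you would run this on the driver:

```python
import shutil
import subprocess  # used by the commented-out install call below
import sys
from pathlib import Path

def stage_wheel(src: Path, dest_dir: Path) -> Path:
    """Copy a wheel from a shared location to a cluster-local directory."""
    dest_dir.mkdir(parents=True, exist_ok=True)
    dest = dest_dir / src.name
    shutil.copy(src, dest)
    return dest

def pip_install_cmd(wheel: Path) -> list[str]:
    """Build the pip command that installs the staged wheel."""
    return [sys.executable, "-m", "pip", "install", "--upgrade", str(wheel)]

# On a Databricks driver (illustrative paths and wheel name):
# wheel = stage_wheel(
#     Path("/Workspace/Shared/Libraries/my-library-0.2.7-py3-none-any.whl"),
#     Path("/local_disk0/tmp"),
# )
# subprocess.check_call(pip_install_cmd(wheel))
```

Installing from /local_disk0/tmp/ sidesteps the permission restrictions on the shared workspace path, because the copy is owned by the process doing the install.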
2. Use Workspace Library Feature
- Upload your .whl file using Databricks' UI under Workspace > Libraries, then add it to your cluster from the Libraries pane. This ensures Databricks manages permissions and installation correctly.
3. Databricks Workflow/Job Library References
- In your Databricks job or workflow configuration, use the "Library" reference method rather than Poetry's {file = "..."} syntax; the latter does not work for cluster-wide installations.
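For a dbx-driven workflow, the deployment file is where the wheel should be declared. A hedged sketch, assuming the common dbx deployment layout; the workflow name, task fields, and the dbfs path are illustrative, not taken from your project:

```yaml
# conf/deployment.yml (dbx) -- names and paths are illustrative
environments:
  default:
    workflows:
      - name: "my-workflow"
        libraries:
          - whl: "dbfs:/FileStore/wheels/my-library-0.2.7-py3-none-any.whl"
        tasks:
          - task_key: "main"
            python_wheel_task:
              package_name: "my_library"
              entry_point: "main"
```

The libraries block maps onto the Databricks Jobs API library specification, so Databricks itself handles fetching and installing the wheel on the job cluster.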
4. Poetry Usage Caveat
- A [tool.poetry.dependencies] entry such as my-library = {file = "..."} works for local installs, not for Databricks cluster installs. If your workflow needs this library, either build the wheel, copy it onto the cluster, and install it via pip install, or use Databricks' built-in methods to attach the wheel.
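For contrast, a well-formed Poetry path dependency looks like the fragment below (the dist/ path is illustrative). It resolves only on a machine where that file exists locally, which is why a Databricks cluster cannot use it:

```toml
# pyproject.toml -- local development only; the wheel path is illustrative
[tool.poetry.dependencies]
python = "^3.10"
my-library = { file = "dist/my-library-0.2.7-py3-none-any.whl" }
```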
Example Correct Installation Command
Upload your wheel to /local_disk0/tmp/ and install using:
pip install --upgrade /local_disk0/tmp/my-library-0.2.7-py3-none-any.whl
Alternatively, upload via the Workspace Libraries UI and attach to your cluster/job directly.
References and Further Steps
- Check or update your permissions for the /Workspace/Shared/Libraries/ directory. If you must use it, confirm your cluster's service account can read from it.
- Consider storing and referencing wheel files in cluster-specific temp directories, or use Databricks' recommended methods for uploading custom libraries.
- For dbx workflows, specify libraries in the job YAML or via the UI, not with Poetry's local file reference.
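The permissions check from the first bullet can be sketched as a plain readability probe, run on the cluster driver. The helper name is my own and the probed path is illustrative:

```python
import os

def can_read(path: str) -> bool:
    """Return True if the current process can read the given file."""
    return os.path.exists(path) and os.access(path, os.R_OK)

# Illustrative: probe the shared wheel location that raised Errno 1.
# print(can_read("/Workspace/Shared/Libraries/my-library-0.2.7-py3-none-any.whl"))
```

If this returns False for the shared path, the fix belongs in workspace permissions (or in relocating the wheel), not in the install command.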
Fix your workflow by installing your wheel from a cluster-local directory or using Databricks' Workspace library features, rather than directly referencing /Workspace/Shared/Libraries/ in Poetry.