05-01-2025 11:04 PM
Hello guys!
I am getting this error when running a job:
ERROR: Could not install packages due to an OSError: [Errno 13] Permission denied: '/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.11/site-packages/some-python-package'
I have listed a wheel file as a dependency in the job environment and am running a Python script.
I have seen this error before, when I was trying to install a Python package via the Databricks terminal. That was when I learned that Databricks doesn't allow you to install packages from the terminal.
But this error shouldn't happen when running a job, right? In a job, there is no way to install packages from anywhere but the designated job environment.
Please guide me through how to resolve it and get my jobs running again.
05-06-2025 06:56 AM
Hey @anmol-aidora ,
The error you’re seeing happens because the job is trying to install a package into a system directory (/local_disk0/...) where it doesn’t have the necessary permissions.
The correct way to install a .whl file for a job is by using the Libraries system in Databricks, and this should be done when configuring the cluster that runs the job.
When setting up the cluster (either directly or through the job configuration), you can attach a library from several sources:
DBFS
S3
Maven or PyPI
Workspace
And different library types: .jar, .zip, .tar, .whl, etc.
Once the library is attached to the cluster, Databricks will ensure it is installed before your job starts, and it will be available during execution without requiring any manual installation from code.
In short: do not install packages inside the job script. Use the job or cluster configuration to attach the .whl as a library. That will resolve the permission issue and ensure your environment is ready.
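For example, for a job that runs on a classic (non-serverless) job cluster, the wheel can be attached as a task-level library in the job definition. This is only a rough sketch, the paths and cluster settings below are placeholders rather than values from your setup:

```yaml
# Sketch: job task with the .whl attached as a library on a classic job cluster.
# All paths, spark_version and node_type_id are placeholders; use values valid in your workspace.
tasks:
  - task_key: my_task
    spark_python_task:
      python_file: /Workspace/Users/<you>/my_script.py
    libraries:
      - whl: /Volumes/<catalog>/<schema>/<volume>/my_package.whl
    new_cluster:
      spark_version: 15.4.x-scala2.12
      node_type_id: i3.xlarge
      num_workers: 1
```

With the library declared this way, Databricks installs the wheel before the task starts, so the script never has to call pip itself.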
Hope this helps! 🙂
Isi
05-06-2025 07:19 AM - edited 05-06-2025 07:22 AM
Hi Isi, I am not trying to install the `.whl` file in a notebook; in fact, I am not using a notebook at all.
Here is my anonymised job script:
resources:
  jobs:
    download_job_1234:
      name: download_job
      tasks:
        - task_key: download_raw_files
          spark_python_task:
            python_file: /Workspace/Users/abcd/download_raw_files.py
          min_retry_interval_millis: 900000
          disable_auto_optimization: false
          environment_key: download_raw_files_env
      queue:
        enabled: true
      environments:
        - environment_key: download_raw_files_env
          spec:
            client: "2"
            dependencies:
              - /Volumes/workspace/default/dists/abcd.whl
Thank you for pointing out the `Libraries` feature. I will look for it, as it's not listed in my UI. Is it available in Databricks serverless?
05-06-2025 07:51 AM
Hey @anmol-aidora
The Libraries section in the UI is available under Compute > select your cluster > Libraries. To make sure your .whl file is accessible and valid, you could try attaching it manually to an interactive cluster using the UI, just like I showed in my previous message.
Try attaching the .whl to another cluster and see if it installs correctly. If it works, at least we know that the file is accessible and the package installs properly. I’ve seen some issues in the past where jobs couldn’t access volumes properly, even though the file existed.
As an alternative, you could translate the volume path into a direct bucket or container path. Since an external volume is essentially just a layer over your object storage managed via Unity Catalog, you can instead point to the underlying file system path.
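For instance, if the volume is backed by an S3 bucket, a classic cluster's library spec could point straight at the object path instead. Purely illustrative — the bucket name is made up, and this applies to cluster libraries rather than serverless environments:

```yaml
# Hypothetical: reference the same wheel by the storage path that backs the volume.
libraries:
  - whl: s3://my-bucket/dists/my_package.whl   # placeholder; use your volume's actual storage location
```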
Hope this helps, 🙂
Isi
05-06-2025 01:46 PM
Hi Isi,
I don't have any clusters available for which I can set up Libraries; I am using Databricks serverless.
05-07-2025 04:28 AM
Hi @anmol-aidora ,
Thanks for sharing the screenshot! Just to clarify, the compute you’re showing is a SQL Warehouse, which only supports SQL. That means you can’t install Python libraries or run Python code there, since it doesn’t include a Python runtime.
This comes up often, so I just wanted to point it out in case there’s any confusion 🙂. To test and install .whl packages, you’ll need to use an interactive cluster (All-purpose compute), which supports Python, SQL, and Scala.
Also, just to clarify: SQL Warehouses come in Classic, Pro, or Serverless tiers, while interactive clusters are separate compute resources that must be created and managed on their own.
Regards,
Isi
05-07-2025 06:14 AM
Thanks for clarifying, Isi, really appreciate it.