Thursday
I am running into the following error while trying to deploy a serverless job running a spark_python_task with GIT as the source for the code. The job was deployed as part of a DAB from a GitHub Actions runner.

The run fails with this error message:

```
Library installation failed: Library installation attempted on serverless compute and failed. The library file does not exist or the user does not have permission to read the library file. Please check if the library file exists and the user has the right permissions to access the file. Error code: ERROR_NO_SUCH_FILE_OR_DIRECTORY, error message: Notebook environment installation failed:
ERROR: Could not open requirements file: [Errno 2] No such file or directory: '/tmp/dbx_pipeline/search_model_infra/src/requirements.txt'
```

This is my DAB definition:
```yaml
resources:
  jobs:
    Search_Infra_Setup_VS_Endpoint_and_Index:
      name: "[Search Infra] Setup VS Endpoint and Index"
      tasks:
        - task_key: run_setup_script
          spark_python_task:
            python_file: dbx_pipeline/search_model_infra/src/setup_vector_search.py
            parameters:
            source: GIT
          environment_key: default_python
      git_source:
        git_url: https://github.com/git_repo
        git_provider: gitHub
        git_branch: develop
      queue:
        enabled: true
      environments:
        - environment_key: default_python
          spec:
            dependencies:
              - -r dbx_pipeline/search_model_infra/src/requirements.txt
            environment_version: "4"
```
Thursday
Hey @aav331, here's a focused analysis of the issue and how to fix it.
```yaml
resources:
  jobs:
    search_infra_setup:
      name: "[Search Infra] Setup VS Endpoint and Index"
      tasks:
        - task_key: run_setup_script
          spark_python_task:
            python_file: ../src/setup_vector_search.py
            source: WORKSPACE
          environment_key: default_python
          libraries:
            - requirements: /Workspace/${workspace.file_path}/requirements.txt
      environments:
        - environment_key: default_python
          spec:
            environment_version: "4"
```
In your bundle, ensure the requirements.txt is included (for example via a bundle `sync` include or workspace files) so that it ends up under /Workspace/${workspace.file_path}/requirements.txt at deploy time.
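As a minimal sketch of that include step, assuming a `databricks.yml` at the bundle root and the repository layout from your post (the bundle name `search_model_infra` is a placeholder), a `sync.include` entry guarantees the file is uploaded on `databricks bundle deploy`, even if it is excluded by .gitignore:

```yaml
# databricks.yml -- sketch only; the bundle name is an assumption
bundle:
  name: search_model_infra

sync:
  include:
    # Force the requirements file to be synced to ${workspace.file_path}
    # alongside the rest of the bundle tree.
    - dbx_pipeline/search_model_infra/src/requirements.txt
```

Note that a synced file keeps its relative path, so the `requirements:` reference in the job spec must point at wherever the file actually lands under `${workspace.file_path}`.

Alternatively, if the code has to stay GIT-sourced, one way to sidestep the file lookup entirely is to list the packages inline in the environment spec instead of using `-r`; the package names below are illustrative placeholders, not taken from your post:

```yaml
environments:
  - environment_key: default_python
    spec:
      dependencies:
        # Inline entries replace the `-r relative/path` reference that
        # failed to resolve in the original error. Placeholder packages;
        # substitute your own.
        - databricks-vectorsearch
        - pyyaml
      environment_version: "4"
```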
Friday
Thank you @Louis_Frolio! I used Pattern C and it resolved it for me.
yesterday
@aav331, if you are happy with the result, please "Accept as Solution." This will help others who may be in the same boat. Cheers, Louis.