Databricks Serverless: Package import fails from notebook in subfolder after wheel installation

lezwon
Contributor
I have a Python package installed via a wheel file in a Databricks serverless environment. Imports from the package work when my notebook is in the root directory, but fail when the notebook is in a subfolder. How can I fix this?

```
src/
├── datalake_utils/
│   ├── __init__.py
│   └── utils.py
├── main.py              # This notebook can import datalake_utils
└── relayevents/
    └── relayevents.py   # This notebook can't import datalake_utils
```

Here is how I install it:

```
%pip install -r /path/to/requirements.txt
```

```python
# If notebook is in src/
from datalake_utils import utils as ut  # Works fine

# If notebook is in src/relayevents/
from datalake_utils import utils as ut  # ModuleNotFoundError
```

## Environment

- Databricks Serverless
- Python package installed via wheel file
- Package is properly installed (confirmed via `pip list`)
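A minimal diagnostic sketch (not Databricks-specific) for this kind of problem: it prints the front of `sys.path` and shows which file Python would actually import a module from, without importing it. Here `json` stands in for `datalake_utils`, since the package in question only exists in the original environment.

```python
import sys
import importlib.util

def locate(module_name):
    """Return the file path Python would load the module from, or None."""
    spec = importlib.util.find_spec(module_name)
    return spec.origin if spec else None

# The front of sys.path decides which copy of a package wins; in a
# Databricks notebook the notebook's own directory is typically on the
# path, which is why resolution can differ between src/ and src/relayevents/.
print(sys.path[:3])

# Substitute your own package name (e.g. "datalake_utils") for "json".
print(locate("json"))
```

Comparing `locate("datalake_utils")` from both notebook locations shows immediately whether the two notebooks resolve the name to the same installation.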

lezwon
Contributor

It turns out there is a pre-installed package called `datalake_utils` in the Databricks serverless environment, which was shadowing my wheel. Renaming my package to something else fixed the imports from every folder.
