@Prashant2
You're encountering an important limitation of Databricks DLT pipelines on Serverless Compute: custom Python package installation via `pip install` inside the notebook is currently not supported in DLT Serverless.
This is not a bug, but rather a limitation by design.
In Serverless DLT, the runtime environment is managed and hardened; it does not allow arbitrary package installation at notebook execution time, for reasons of:
- Security
- Stability
- Faster cold start times
While a regular serverless cluster (for interactive notebooks or jobs) allows `pip install`, DLT Serverless pipelines run in a restricted environment and must use packages that are:
- Pre-installed in the serverless runtime, or
- Packaged via a custom wheel or installed via a requirements.txt at deployment time (not inside the notebook)
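As a sketch of the deployment-time approach: depending on your workspace's release, serverless DLT pipelines may let you declare dependencies in the pipeline settings JSON via an `environment.dependencies` list (check the current Databricks docs for availability). The pipeline name, package versions, and wheel path below are hypothetical placeholders:

```json
{
  "name": "my_dlt_pipeline",
  "serverless": true,
  "environment": {
    "dependencies": [
      "simplejson==3.19.2",
      "/Volumes/main/default/libs/my_pkg-0.1.0-py3-none-any.whl"
    ]
  }
}
```

The same settings can usually be edited through the pipeline UI (Settings > Environment) or managed declaratively with Databricks Asset Bundles; the key point is that dependencies are resolved at deployment time, not inside the notebook.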
LR