Install bundle built artifact as notebook-scoped library
03-26-2025 12:28 PM - edited 03-26-2025 12:31 PM
We are having a hard time finding an intuitive way to use the artifacts we build and deploy with databricks bundle deploy as notebook-scoped libraries.
Desired result:
Having internal artifacts available notebook-scoped for jobs through configuration
or
Having an easier way of %pip installing internal artifacts
Our setup:
A monorepo with multiple bundles and a shared package of common functionality.
Every bundle contains this snippet:
artifacts:
  common-lib:
    type: "whl"
    path: "../common-lib"

This ensures that the common-lib is built and included in artifacts/.internal/ for each bundle deployment.
Then we specify this for every task that needs it:
libraries:
  - whl: ${workspace.artifact_path}/.internal/common-0.1-py3-none-any.whl

This works, but we found out that it installs the library at the cluster level.
This results in many entries in the clusters' Libraries tab, all pointing to the bundles' internal artifacts.
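To illustrate the repetition (job, task, and notebook names below are placeholders), the reference ends up duplicated on every task, which is what produces all those Libraries-tab entries:

# Sketch of a job resource with the wheel referenced per task (names are placeholders)
resources:
  jobs:
    example_job:
      name: example_job
      tasks:
        - task_key: ingest
          notebook_task:
            notebook_path: ../src/ingest_notebook.ipynb
          libraries:
            - whl: ${workspace.artifact_path}/.internal/common-0.1-py3-none-any.whl
        - task_key: transform
          notebook_task:
            notebook_path: ../src/transform_notebook.ipynb
          libraries:
            - whl: ${workspace.artifact_path}/.internal/common-0.1-py3-none-any.whl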
Working "solution":
We have gotten it to work by doing this:
# For each job.yml under resources
parameters:
  - name: common-lib-path
    default: ${workspace.artifact_path}/.internal/common-0.1-py3-none-any.whl

# At the top of each notebook that needs the common lib
# Cell 1
lib_path = dbutils.widgets.get("common-lib-path")

# Cell 2
%pip install $lib_path

It works, but it is not a proper solution.
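For completeness, here is roughly where that parameter sits in a job resource (a sketch; job, task, and notebook names are placeholders). Job-level parameters are exposed to notebook tasks as widgets, which is what makes the dbutils.widgets.get call above work:

# Sketch of resources/<job>.yml with the job-level parameter (names are placeholders)
resources:
  jobs:
    example_job:
      name: example_job
      parameters:
        - name: common-lib-path
          default: ${workspace.artifact_path}/.internal/common-0.1-py3-none-any.whl
      tasks:
        - task_key: transform
          notebook_task:
            notebook_path: ../src/transform_notebook.ipynb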
Is there a good way of achieving this that we have not found?
04-08-2025 12:49 AM
We were not able to find a clean solution for this, so we ended up referencing the common lib like this in every notebook where it is needed:
%pip install ../../../artifacts/.internal/common-0.1-py3-none-any.whl
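If the wheel's version in the filename changes, a small variation is to resolve the file first and then install it. This is only a sketch: the ../../../ depth matches our layout, and the common-* pattern plus the single-match assumption are ours.

# Cell 1: locate the shared wheel relative to the notebook's location in the
# deployed bundle files (path depth and "common-*" pattern are assumptions)
import glob

matches = glob.glob("../../../artifacts/.internal/common-*-py3-none-any.whl")
assert len(matches) == 1, f"expected exactly one common wheel, found: {matches}"
lib_path = matches[0]

# Cell 2: notebook-scoped install; $lib_path is substituted by the %pip magic
%pip install $lib_path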