Install bundle-built artifact as a notebook-scoped library
We are having a hard time finding an intuitive way to make the artifacts we build and deploy with databricks bundle deploy available as notebook-scoped libraries.
Desired result, either:
- internal artifacts made available notebook-scoped to jobs through configuration, or
- an easier way to %pip install internal artifacts.
Our setup:
A monorepo containing multiple bundles and a shared package of common functionality (common-lib).
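For context, the repository layout looks roughly like this (directory names are illustrative):

```
monorepo/
├── common-lib/            # shared package, built as a wheel
├── bundle-a/
│   ├── databricks.yml
│   └── resources/
└── bundle-b/
    ├── databricks.yml
    └── resources/
```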
Every bundle contains this snippet in its databricks.yml:
```yaml
artifacts:
  common-lib:
    type: "whl"
    path: "../common-lib"
```
This ensures that common-lib is built and included in artifacts/.internal/ on each bundle deployment.
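For reference, one bundle's full databricks.yml looks roughly like this (bundle name and target are illustrative):

```yaml
bundle:
  name: bundle-a

artifacts:
  common-lib:
    type: "whl"
    path: "../common-lib"

include:
  - resources/*.yml

targets:
  dev:
    mode: development
```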
Then we specify this for every task that needs it:
```yaml
libraries:
  - whl: ${workspace.artifact_path}/.internal/common-0.1-py3-none-any.whl
```
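For completeness, here is roughly where that sits in a job resource file (job and task names are placeholders):

```yaml
resources:
  jobs:
    example_job:
      name: example-job
      tasks:
        - task_key: main
          notebook_task:
            notebook_path: ../src/example_notebook.py
          libraries:
            - whl: ${workspace.artifact_path}/.internal/common-0.1-py3-none-any.whl
```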
This works, but we found out that it installs the library at the cluster level, which fills each cluster's Libraries tab with entries pointing at the bundles' internal artifacts.
Working "solution":
We have gotten it to work by doing this:
```yaml
# In each job.yml under resources/
parameters:
  - name: common-lib-path
    default: ${workspace.artifact_path}/.internal/common-0.1-py3-none-any.whl
```

```python
# At the top of each notebook that needs the common lib

# Cell 1: read the wheel path from the job parameter
lib_path = dbutils.widgets.get("common-lib-path")

# Cell 2: install it notebook-scoped (%pip expands the Python variable)
%pip install $lib_path
```
It works, but it is not a proper solution.
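A small improvement would be to declare the wheel path once as a bundle variable instead of repeating it in every job.yml. This is a sketch that assumes ${workspace.artifact_path} can be interpolated in a variable default:

```yaml
# databricks.yml: declare the path once (assumes workspace paths resolve in defaults)
variables:
  common_lib_whl:
    description: Path to the shared common-lib wheel
    default: ${workspace.artifact_path}/.internal/common-0.1-py3-none-any.whl
```

```yaml
# job.yml: reference the variable instead of the literal path
parameters:
  - name: common-lib-path
    default: ${var.common_lib_whl}
```

That only reduces duplication, though; the widget/%pip mechanics stay the same, so the underlying question still stands.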
Is there a better way of achieving this that we have not found?

