
Install bundle built artifact as notebook-scoped library

keb
New Contributor

We are having a hard time finding an intuitive way to install the artifacts we build and deploy with databricks bundle deploy as notebook-scoped libraries.

Desired result:
Having internal artifacts available notebook-scoped for job tasks via config,
or
Having an easier way to %pip install internal artifacts.

Our setup:
A monorepo of multiple bundles plus a shared package of common functionality (common-lib).
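
Roughly, the repo looks like this (directory names here are illustrative, inferred from the ../common-lib path used below):

repo/
  common-lib/        # shared package, built as a wheel
  bundle-a/
    databricks.yml
    resources/
  bundle-b/
    databricks.yml
    resources/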

This snippet is in all bundles:

artifacts:
  common-lib:
    type: "whl"
    path: "../common-lib"

This ensures that common-lib is built and uploaded to artifacts/.internal/ as part of each bundle deployment.
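
For context, a minimal sketch of how that snippet sits in each bundle's databricks.yml (the bundle name and target here are placeholders):

# databricks.yml (sketch)
bundle:
  name: example-bundle

artifacts:
  common-lib:
    type: "whl"
    path: "../common-lib"   # the shared package, one level up in the monorepo

targets:
  dev:
    default: true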
Then we specify this for every task that needs it:

libraries:
  - whl: ${workspace.artifact_path}/.internal/common-0.1-py3-none-any.whl

This works, but we found out that it installs the library at the cluster level.
As a result, the clusters' Libraries tabs fill up with entries pointing at the bundles' internal artifacts.
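
For reference, a sketch of the task shape this refers to (the job name, cluster id, and notebook path are placeholders); the whl entry is what ends up listed on the cluster:

# resources/example_job.yml (sketch)
resources:
  jobs:
    example_job:
      name: example-job
      tasks:
        - task_key: main
          existing_cluster_id: "1234-567890-abcdef12"   # placeholder cluster
          notebook_task:
            notebook_path: ../src/example_notebook.py
          libraries:
            - whl: ${workspace.artifact_path}/.internal/common-0.1-py3-none-any.whl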

Working "solution":
We have gotten it to work by doing this:

# In each job.yml under resources/, pass the wheel path as a job parameter
parameters:
  - name: common-lib-path
    default: ${workspace.artifact_path}/.internal/common-0.1-py3-none-any.whl

# At the top of each notebook that needs the common lib:

# Cell 1: read the wheel path from the job parameter
lib_path = dbutils.widgets.get("common-lib-path")

# Cell 2: install it notebook-scoped; %pip interpolates $lib_path
%pip install $lib_path
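
Once the install cell has run, the shared package can be imported in later cells; a minimal sketch, assuming the wheel's top-level module is named common (the actual module name depends on the package):

# Cell 3: the wheel is now available, scoped to this notebook
import common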

It works, but it is not a proper solution.

Is there a good way of achieving this that we have not found?


