Data Engineering

Import functions in databricks asset bundles using source: WORKSPACE

Maxrb
New Contributor II

Hi,

We are using Databricks Asset Bundles, and we write functions that we import into notebooks, for instance:

from utils import helpers

where utils is just a folder in our project root. When running this with source: WORKSPACE, the import fails to resolve, while with source: GIT it works.

Is there a best practice for dealing with this? I know we could build a wheel file, but we have several of these folders, and with uv I am not sure it is really possible to include all the folders we import functions from.

Kind regards

1 REPLY

iyashk-DB
Databricks Employee

In Git folders, the repo root is auto-added to the Python path, so imports like from utils import helpers work, while in workspace folders only the notebook's directory is on the path, which is why it breaks.
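
You can see this from the notebook itself; a quick diagnostic (plain Python, nothing Databricks-specific):

import sys
# Print the interpreter's module search path. In a Git folder the repo
# root shows up here; in a plain workspace folder only the notebook's
# own directory does, which is why "from utils import helpers" fails.
for entry in sys.path:
    print(entry)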

The quick fix is a tiny bootstrap that appends your shared folder to sys.path as follows:

import os, sys
# Example: add the project root (adjust as needed)
sys.path.append(os.path.abspath("/Workspace/Users/<your_user>/my_project_root"))
from utils import helpers
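
A variant of that bootstrap avoids hardcoding the absolute user path (a sketch, assuming the notebook sits one level below the bundle root and runs on a recent DBR where the working directory is the notebook's folder):

import os, sys
# Resolve the project root relative to the notebook's working directory
# instead of hardcoding /Workspace/Users/...; adjust the number of ".."
# segments to match your folder layout.
project_root = os.path.abspath(os.path.join(os.getcwd(), ".."))
if project_root not in sys.path:
    sys.path.append(project_root)
from utils import helpers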

The production-grade fix is to package shared code as a Python wheel (built with uv) and attach it in your bundle's task libraries. Store the wheels in Workspace Files or Unity Catalog volumes. You can either put multiple subpackages into one wheel or attach multiple wheels per task, depending on how you prefer to split the code.
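
For reference, a minimal databricks.yml sketch along those lines; the artifact, job, and task names and the uv build command are illustrative, not taken from this thread:

artifacts:
  shared_code:
    type: whl
    build: uv build --wheel  # any build command works here, e.g. poetry build
    path: .

resources:
  jobs:
    example_job:
      tasks:
        - task_key: main
          notebook_task:
            notebook_path: ./notebooks/main
          libraries:
            - whl: ./dist/*.whl  # attach the wheel(s) produced by the artifact build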

Reference docs:
1 - https://docs.databricks.com/aws/en/dev-tools/bundles/library-dependencies
2 - https://docs.databricks.com/aws/en/dev-tools/bundles/python-wheel