Feasibility of Dynamically Reusing Common User-Defined Functions Across Multiple DLT Notebooks
a week ago
Hi @DataBricks team,
I'm exploring ways to enable dynamic reusability of common user defined functions across multiple notebooks in a DLT (Delta Live Tables) pipeline. The goal is to avoid duplicating code and maintain a centralized location for commonly used UDFs that can be imported or referenced dynamically.
We tried the "import sys" / sys.path.append approach (code below), but that does not work for us either.
Could you please help us find a better way to solve this?
Please let us know how we can implement a similar approach, or an effective pattern, to manage and reuse UDFs across notebooks within DLT workflows.
import sys
# Note: sys.path.append expects a directory, not a .py file;
# it should point at the folder that contains Functions.py
sys.path.append("/path/to/folder_containing_functions")
from Functions import get_schema_for_action, map_yaml_to_pyspark_type, load_config
a week ago
A simple and recommended approach:
If possible, bundle all of those common user-defined functions into a structured Python package and build it as a wheel (.whl) file, for example along the lines of the sketch below.
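As a minimal sketch (the package and module names here are assumptions, not taken from your code), the wheel could be built from a layout like this:

# Hypothetical layout for the shared-UDF package (names are assumptions):
# dlt_common_udfs/
#     __init__.py
#     Functions.py      <- holds get_schema_for_action, map_yaml_to_pyspark_type, load_config
# setup.py              <- shown below
from setuptools import setup, find_packages

setup(
    name="dlt_common_udfs",       # hypothetical package name
    version="0.1.0",
    packages=find_packages(),     # picks up the dlt_common_udfs package
    install_requires=[],          # add pyspark/pyyaml etc. here if your UDFs need them
)
# Build the wheel with: python -m build
# (produces dist/dlt_common_udfs-0.1.0-py3-none-any.whl)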
Once the wheel is built, upload it to a Unity Catalog volume. From there you can pip install it at the top of the DLT pipeline notebook and import whatever functions you need, as sketched below.
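For example (the volume path and wheel name below are assumptions; adjust them to your catalog, schema, and volume), the first cells of the DLT notebook could look like:

# First cell of the DLT notebook: install the wheel from the Unity Catalog volume.
# The path and package name are assumptions; replace them with your own.
%pip install /Volumes/my_catalog/my_schema/libs/dlt_common_udfs-0.1.0-py3-none-any.whl

# Then import the shared UDF helpers and use them in your table definitions.
from dlt_common_udfs.Functions import get_schema_for_action, map_yaml_to_pyspark_type, load_config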
Alternatively, you can install the library at the cluster level, though I'm not sure how to attach a library to serverless compute when the pipeline runs on DLT serverless.

