Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Feasibility of Dynamically Reusing Common User-Defined Functions Across Multiple DLT Notebooks

aswithap
New Contributor

Hi @DataBricks team,

I'm exploring ways to enable dynamic reusability of common user-defined functions (UDFs) across multiple notebooks in a DLT (Delta Live Tables) pipeline. The goal is to avoid duplicating code and to maintain a centralized location for commonly used UDFs that can be imported or referenced dynamically.

we tried "import sys" (i mentioned code below) but which is also not supporting.

Could you please suggest a better way to solve this?

Please let us know how we can implement a similar approach, or an effective pattern for managing and reusing UDFs across notebooks within DLT workflows.

import sys

# sys.path.append expects a directory, not a file; point it at the folder that contains Functions.py
sys.path.append("/path/to/directory_containing_Functions")

from Functions import get_schema_for_action, map_yaml_to_pyspark_type, load_config

1 REPLY

ashraf1395
Honored Contributor

A simple and recommended approach would be:

If possible, bundle all of those common user-defined functions into a structured Python package and build it as a wheel (.whl) file.
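For illustration, a minimal package could look like the sketch below. The package name shared_udfs, the file names, and the version are placeholders; your existing Functions.py would simply move into the package:

shared_udfs/
    pyproject.toml
    src/
        shared_udfs/
            __init__.py
            functions.py   # holds get_schema_for_action, map_yaml_to_pyspark_type, load_config

# pyproject.toml
[build-system]
requires = ["setuptools>=61.0"]
build-backend = "setuptools.build_meta"

[project]
name = "shared_udfs"
version = "0.1.0"

# build the wheel locally (needs: pip install build)
python -m build --wheel
# produces dist/shared_udfs-0.1.0-py3-none-any.whl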

Once the whl file is created, upload it to a Unity Catalog volume. From there you can either:

- pip install it at the top of the DLT pipeline notebook and import whatever functions you need, or
- install the library at the cluster level (though I'm not sure how to attach a library to serverless compute when using DLT serverless).
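As a rough sketch of the first option, the top of a DLT notebook could then look like this. The volume path, catalog and schema names, and the signatures of the imported helper functions are all assumed for illustration only:

# install the shared wheel from a Unity Catalog volume (path is a placeholder)
%pip install /Volumes/main/default/libs/shared_udfs-0.1.0-py3-none-any.whl

import dlt
from shared_udfs.functions import get_schema_for_action, map_yaml_to_pyspark_type, load_config

@dlt.table(name="bronze_events")
def bronze_events():
    # example usage only; the real signatures of your helpers may differ
    config = load_config("/Volumes/main/default/configs/events.yaml")
    schema = get_schema_for_action(config["action"])
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .schema(schema)
        .load(config["source_path"])
    )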
