Data Engineering

Is there a way to register a scala function that is available to other notebooks?

qwerty1
Contributor

I am in a situation where I have a notebook that runs in a pipeline and creates a live streaming table, so I cannot use a language other than SQL in the pipeline. I would like to format a certain column in the pipeline using Scala code (it's a complicated formatting that is difficult to replicate in SQL).

Spark allows you to register Scala methods as UDFs and access those registered methods in SQL.
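For context, this is the registration pattern I mean (shown here in PySpark for brevity; the Scala `spark.udf.register` API is analogous, and `clean_digits` is a made-up stand-in for my real formatting logic):

```python
def clean_digits(raw):
    """Hypothetical stand-in for the complicated column formatting."""
    digits = "".join(ch for ch in (raw or "") if ch.isdigit())
    return "+" + digits if digits else None

# Registering the function makes its name callable from SQL in the
# same Spark session, e.g.:
#   from pyspark.sql.types import StringType
#   spark.udf.register("clean_digits", clean_digits, StringType())
#   spark.sql("SELECT clean_digits(phone) FROM numbers")
```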

But given my current situation (a DLT pipeline), I cannot include the Scala method, or the statement registering it in the Spark context, in the notebook.

Is there any work around here?

3 REPLIES

-werners-
Esteemed Contributor III

No, DLT does not work with Scala, unfortunately.

Delta Live Tables are not vanilla Spark.

Is Python an option instead of Scala?

qwerty1
Contributor

Yes, Python is an option if I can use the https://pypi.org/project/phonenumbers/ library.

-werners-
Esteemed Contributor III

AFAIK you can create Python UDFs, but somehow I can't find the docs anymore.

https://docs.databricks.com/workflows/delta-live-tables/delta-live-tables-cookbook.html

and

https://docs.databricks.com/workflows/delta-live-tables/delta-live-tables-cookbook.html#import-pytho...

But they seem to have been removed. If someone knows where to find them...
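As best I recall, the pattern those cookbook pages described was: add a Python notebook to the same pipeline and register the UDF there, so the SQL notebook can call it by name. A hedged sketch, assuming the DLT runtime (which provides `spark`) and the phonenumbers package; the UDF name, default region, and column names are all illustrative:

```python
# Runs as a Python notebook inside the same DLT pipeline. The `pyspark`
# and `phonenumbers` imports are only available in that environment, so
# everything is wrapped in a function rather than executed at import time.
def register_phone_udf(spark):
    import phonenumbers
    from pyspark.sql.types import StringType

    def to_e164(raw):
        # Normalize to E.164 ("+15551234567"); "US" is an illustrative
        # default region for numbers written without a country code.
        try:
            parsed = phonenumbers.parse(raw, "US")
            return phonenumbers.format_number(
                parsed, phonenumbers.PhoneNumberFormat.E164
            )
        except phonenumbers.NumberParseException:
            return None

    # After this runs, a SQL notebook in the pipeline can use:
    #   SELECT to_e164(phone) FROM LIVE.raw_numbers
    spark.udf.register("to_e164", to_e164, StringType())
```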
