Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

swatish0395
by New Contributor III
  • 3260 Views
  • 3 replies
  • 4 kudos

Resolved! How to create a Scala JAR using a Databricks notebook and save it to a file path inside Databricks

I have a Scala function as below; I am unable to understand how to write a Scala JAR with the same. Please find below the code I have used (from Enforcing Column-Level Encryption - Databricks): %scala import com.macasaet.fernet.{Key, StringValidator, Token} import o...

Latest Reply
swatish0395
New Contributor III
  • 4 kudos

I finally had to create the JAR using the IntelliJ and sbt configuration on the same environment, and then installed the JAR on the cluster; it worked (see the sketch below this post).

2 More Replies
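A minimal sketch of the sbt project described in the accepted reply. The project name, package, dependency coordinates, and versions are placeholders, and the Fernet UDF body is assumed from the "Enforcing Column-Level Encryption" blog post referenced in the question; adjust to match your cluster's Scala/Spark versions.

```scala
// build.sbt (placeholder versions; match your cluster's Databricks Runtime)
// name := "fernet-udfs"
// scalaVersion := "2.12.15"
// libraryDependencies ++= Seq(
//   "org.apache.spark"    %% "spark-sql"    % "3.3.0" % "provided",
//   "com.macasaet.fernet"  % "fernet-java8" % "1.5.0"
// )

// src/main/scala/com/example/FernetUdfs.scala
package com.example

import com.macasaet.fernet.{Key, StringValidator, Token}
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.expressions.UserDefinedFunction
import org.apache.spark.sql.functions.udf

object FernetUdfs {

  // Decrypts a Fernet token with the supplied key string.
  // The validator's time-to-live is stretched so older tokens still validate.
  val decrypt: UserDefinedFunction = udf { (encrypted: String, keyStr: String) =>
    val key = new Key(keyStr)
    val validator = new StringValidator {
      override def getTimeToLive: java.time.temporal.TemporalAmount =
        java.time.Duration.ofDays(3650)
    }
    Token.fromString(encrypted).validateAndDecrypt(key, validator)
  }

  // Call from a notebook once the JAR is attached, to expose the UDF to SQL.
  def register(spark: SparkSession): Unit =
    spark.udf.register("fernet_decrypt", decrypt)
}
```

Running `sbt package` produces the JAR under target/scala-2.12/, which can then be uploaded and installed on the cluster (Libraries > Install new), as the reply describes, instead of trying to build the JAR from inside a notebook.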
qwerty1
by Contributor
  • 6576 Views
  • 3 replies
  • 1 kudos

Is there a way to register a Scala function that is available to other notebooks?

I am in a situation where I have a notebook that runs in a pipeline that creates a "live streaming table". So, I cannot use a language other than SQL in the pipeline. I would like to format a certain column in the pipeline using Scala code (it's a ...

Latest Reply
-werners-
Esteemed Contributor III
  • 1 kudos

No, DLT does not work with Scala, unfortunately. Delta Live Tables are not vanilla Spark. Is Python an option instead of Scala?

2 More Replies
TS
by New Contributor III
  • 3693 Views
  • 3 replies
  • 3 kudos

Resolved! Turn a spark.sql query into a Scala function

Hello, I'm learning Scala / Spark and am trying to understand what's wrong with my function. I have a spark.sql query stored in a variable: val uViewName = spark.sql(""" SELECT v.Data_View_Name FROM apoHierarchy AS h INNER JOIN apoView AS v ON h.View_N...

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 3 kudos

Try adding .first()(0); it will return only the value from the first row/column, whereas currently you are returning a Dataset (see the sketch after this post): var uViewName = spark.sql(s""" SELECT v.Data_View_Name FROM apoHierarchy AS h INNER JOIN apoView AS v ON h.View_Name = v.Context_View_N...

2 More Replies
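A minimal sketch of the pattern suggested in this reply. The original query is truncated above, so the join condition past the cut-off is guessed for illustration; .first() takes the first Row of the DataFrame and getString(0) (the typed equivalent of the reply's (0) indexing) pulls out the single column value.

```scala
import org.apache.spark.sql.SparkSession

object ViewNameLookup {

  // Returns a single Data_View_Name value instead of a Dataset[Row].
  // Table and column names follow the thread; the join condition beyond the
  // truncation point is illustrative only.
  def uViewName(spark: SparkSession): String = {
    val df = spark.sql(
      """
        |SELECT v.Data_View_Name
        |FROM apoHierarchy AS h
        |INNER JOIN apoView AS v
        |  ON h.View_Name = v.Context_View_Name
        |""".stripMargin)

    // spark.sql(...) returns a DataFrame; .first() gives the first Row,
    // and getString(0) reads its first column as a plain String.
    df.first().getString(0)
  }
}
```

Calling ViewNameLookup.uViewName(spark) then yields the view name as a String that can be interpolated into further spark.sql calls, rather than passing a Dataset around.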
Labels