Call Scala application JAR in notebook

sgarcia
New Contributor II

Hi,

Is there any way to execute a Scala-Spark application JAR inside a notebook, without using jobs?

I have different JARs for different intakes, and I want to call them from a notebook so I can invoke them in a parameterized way.

Thanks

4 REPLIES

RKNutalapati
Valued Contributor

Hi @Sergio Garccia,

Try the below and let me know if it works:

scala -cp <Your Jar> <Main Class> <arguments>
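
In a notebook, a command like that would have to run through a %sh cell. A rough sketch, assuming the scala launcher is installed and on the driver's PATH (it often is not on Databricks clusters), with a hypothetical JAR path and main class:

%sh
# Hypothetical path and class: substitute your own JAR and entry point.
# This only works if the scala launcher is available on the driver node.
scala -cp /dbfs/FileStore/jars/my-intake.jar com.example.intake.IngestMain --intake sales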

If you are using a job cluster, add the JAR as a dependency.

Thanks

sgarcia
New Contributor II

Hi @Rama Krishna N,

It doesn't work. I think it's not recognizing the scala command.

Thanks!!

User16763506477
Contributor III

Hi @Sergio Garccia, you can attach the JAR to the cluster, then import the package containing the main method and call the main method from the notebook (see the sketch below).
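
For example, a minimal sketch in a Scala notebook cell, assuming the attached JAR contains a hypothetical object com.example.intake.IngestMain with a standard main(Array[String]) entry point:

// Hypothetical names: IngestMain stands in for whatever main class your JAR exposes.
import com.example.intake.IngestMain

// Call the entry point directly, passing arguments as if from the command line.
IngestMain.main(Array("--intake", "sales", "--date", "2022-01-01"))

Because the notebook builds the argument array itself, this also covers the parameterized calls you asked about; for instance, the values can come from notebook widgets:

// Define a widget once, then read its current value on each run.
dbutils.widgets.text("intake", "sales")
val intake = dbutils.widgets.get("intake")
IngestMain.main(Array("--intake", intake))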

jose_gonzalez
Moderator

Hi @Sergio Garccia,

Just a friendly follow-up. Do you still need help? Have you checked our docs? This might help: https://docs.databricks.com/workflows/jobs/jobs.html#jar-jobs-1