Data Engineering

Call Scala application JAR in notebook

sgarcia
New Contributor II

Hi,

Is there any way to execute a Scala/Spark application JAR inside a notebook, without using jobs?

I have different JARs for different intakes, and I want to call them from a notebook so I can invoke them in a parameterized way.

Thanks

4 REPLIES

RKNutalapati
Valued Contributor

Hi @Sergio Garccia,

Try below and let me know if it works.

scala -cp <Your Jar> <Main Class> <arguments>

If you are using a job cluster, add the JAR as a dependency.
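
For instance, a hypothetical invocation might look like the following (the JAR path, main class, and arguments are placeholders, not anything from your setup):

scala -cp /path/to/intake-etl.jar com.example.IntakeMain --source sales --date 2023-01-01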

Thanks

sgarcia
New Contributor II

Hi @Rama Krishna N,

It doesn't work; I think the scala command isn't being recognized.

Thanks!!

User16763506477
Contributor III

Hi @Sergio Garccia, you can attach the JAR to the cluster, then import the package containing the main method and call that method from the notebook.
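
For example, a minimal sketch of a Scala notebook cell, assuming the JAR is already attached to the cluster and exposes a hypothetical entry point object com.example.IntakeMain:

// Scala notebook cell. com.example.IntakeMain and its arguments are
// placeholders for whatever entry point your JAR actually exposes.
import com.example.IntakeMain

// Invoke the JAR's entry point directly, passing parameters as the args array.
IntakeMain.main(Array("--source", "sales", "--date", "2023-01-01"))

Because the arguments are just a plain Array passed from the notebook, this also covers the parameterized use case: notebook widgets or variables can be substituted into the array for each intake.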

jose_gonzalez
Moderator

Hi @Sergio Garccia,

Just a friendly follow-up: do you still need help? Have you checked our docs? This might help: https://docs.databricks.com/workflows/jobs/jobs.html#jar-jobs-1
