Call a Scala application jar in a notebook
04-13-2022 02:58 AM
Hi,
Is there any way to execute a Scala-Spark application jar inside a notebook, without using jobs?
I have different jars for different intakes, and I want to call them from a notebook so that I can invoke them in a parameterized way.
Thanks
04-13-2022 04:16 AM
Hi @Sergio Garccia,
Try below and let me know if it works.
scala -cp <Your Jar> <Main Class> <arguments>
If you are using a job cluster, add the jar as a dependency.
Thanks
04-13-2022 04:29 AM
Hi @Rama Krishna N,
It doesn't work; I think the scala command is not being recognized.
Thanks!!
09-19-2022 10:22 PM
Hi @Sergio Garccia, you can attach the jar to the cluster. Then import the package containing the main method and call the main method from the notebook.
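For example, once the jar is attached to the cluster, a notebook cell along these lines should work. This is only a minimal sketch: the package, object name, and arguments below are hypothetical placeholders for whatever entry point your jar actually exposes.

// Minimal sketch, assuming the attached jar contains an object
// com.example.intake.IntakeMain with a standard main(args: Array[String]) entry point
// (hypothetical names; replace with your own package and class).
import com.example.intake.IntakeMain

// Pass notebook parameters straight through to the jar's entry point
IntakeMain.main(Array("--source", "sales", "--date", "2022-04-13"))

Since the main method just receives an Array[String], you can build that array from notebook widgets or parameters, which covers the parameterized calls you mentioned.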
10-03-2022 09:25 AM
Hi @Sergio Garccia,
Just a friendly follow-up. Do you still need help? Have you checked our docs? This might help: https://docs.databricks.com/workflows/jobs/jobs.html#jar-jobs-1

