Call Scala application JAR in notebook
Hi,

Is there any way to execute a Scala Spark application JAR inside a notebook, without using Jobs? I have different JARs for different intakes, and I want to call them from a notebook so that I can invoke them in a parameterized way.

Thanks
- 4720 Views
- 4 Replies
- 1 Kudos
 
  Latest Reply  
Hi @Sergio Garccia, just a friendly follow-up. Do you still need help? Have you checked our docs? This might help: https://docs.databricks.com/workflows/jobs/jobs.html#jar-jobs-1
- 1 kudos
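
A minimal sketch of one way this can work, assuming the intake JAR is attached to the cluster as a library: its classes are then on the notebook's classpath, so the entry point can be called like an ordinary method. The object name `com.example.IntakeJob` and the `--intake`/`--date` flags below are hypothetical placeholders, not from this thread.

```scala
// Notebook widgets provide the parameterization: callers can set these when
// running the notebook (or via %run / the REST API).
dbutils.widgets.text("intake", "customers", "Intake name")
dbutils.widgets.text("run_date", "2024-01-01", "Run date")

val intake  = dbutils.widgets.get("intake")
val runDate = dbutils.widgets.get("run_date")

// Invoke the JAR's entry point directly, passing arguments the same way
// spark-submit would. com.example.IntakeJob is a hypothetical main object
// from a JAR attached to the cluster as a library.
com.example.IntakeJob.main(Array("--intake", intake, "--date", runDate))
```

One caveat with this approach: if the application builds its own SparkSession via `SparkSession.builder.getOrCreate()`, it will reuse the notebook's existing session, which is usually what you want; code that stops the session or assumes a fresh one may need adjusting.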