- 2219 Views
- 4 replies
- 1 kudos
Hi, is there any way to execute a jar Scala/Spark application inside a notebook, without using jobs? I have different jars for different intakes and I want to call them from a notebook, so I can call them in a parameterized way. Thanks
Latest Reply
Hi @Sergio Garccia​, just a friendly follow-up: do you still need help? Have you checked our docs? This might help: https://docs.databricks.com/workflows/jobs/jobs.html#jar-jobs-1
3 More Replies
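As a sketch of the pattern asked about in this thread: if the jar is already attached to the cluster as a library, its entry point can be invoked directly from a notebook cell, with the parameters assembled per intake. The class name `com.example.intake.Main` and its flags below are hypothetical, not from the thread.

```python
# Hypothetical helper: build the argument vector for a given intake, so
# the same jar entry point can be reused in a parameterized way.
def intake_args(intake_name, run_date):
    """Return the CLI-style arguments passed to the jar's main method."""
    return ["--intake", intake_name, "--date", run_date]

# Inside a Databricks notebook (not runnable locally), with the jar
# attached as a cluster library, the main method can then be called
# from a Scala cell, e.g.:
#
#   %scala
#   com.example.intake.Main.main(Array("--intake", "sales", "--date", "2023-01-01"))

print(intake_args("sales", "2023-01-01"))
# → ['--intake', 'sales', '--date', '2023-01-01']
```

This avoids creating a job per jar, at the cost of the jar having to be attached to the cluster the notebook runs on.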
by Nosa • New Contributor II
- 1126 Views
- 3 replies
- 4 kudos
I am developing an application and I want to use Databricks in it. I built it with Python and Godot. How can I use Databricks from my application?
Latest Reply
Hi @Ensiyeh Shojaei​, which cloud service are you using? Depending on the cloud provider, there is a list of tools that can help you connect to and interact with Databricks from your application.
2 More Replies
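One cloud-agnostic option worth knowing about is calling the Databricks SQL Statement Execution REST API (`/api/2.0/sql/statements/`) from the external app, for example a Python backend behind a Godot frontend. A minimal sketch, assuming you have a workspace host, a personal access token, and a SQL warehouse id (all placeholders below):

```python
import json
import urllib.request

def statement_request(host, token, warehouse_id, sql):
    """Build an HTTP request that submits a SQL statement to the
    Databricks SQL Statement Execution API."""
    payload = json.dumps({"warehouse_id": warehouse_id, "statement": sql})
    return urllib.request.Request(
        url=f"https://{host}/api/2.0/sql/statements/",
        data=payload.encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Placeholder host/token/warehouse id -- substitute your own values.
req = statement_request(
    "adb-1234.azuredatabricks.net", "dapi-PLACEHOLDER", "abc123", "SELECT 1"
)
print(req.full_url)

# Actually sending it requires real credentials:
#   with urllib.request.urlopen(req) as resp:
#       print(json.load(resp))
```

The same request works from any language with an HTTP client, which is why it fits an app that is not written against a cloud-specific SDK.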
- 4442 Views
- 4 replies
- 2 kudos
Hello, I am trying to host my application on Databricks and I want to expose my application's REST APIs so they can be accessed from Postman, but I am unable to find any documentation on how to do this. I tried to write simple flask "hello world" code to try ...
Latest Reply
I did this using an Azure Web App and exposed the APIs there; I was able to access them from Postman and from Databricks. I did not run the Python app on Databricks itself.
3 More Replies
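For context on the workaround above: a driver-local web server like the Flask "hello world" from the question does start on the cluster, but cluster ports are typically not reachable from outside, which is why the API ends up hosted on a separate service such as an Azure Web App. A minimal stdlib sketch of such a hello-world endpoint (no Flask dependency; names and route are illustrative), including a local request against it:

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class HelloHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Respond to every GET with a small JSON payload.
        body = json.dumps({"message": "hello world"}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # silence per-request logging

# Port 0 asks the OS for any free port; serve on a background thread.
server = HTTPServer(("127.0.0.1", 0), HelloHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_port}/"
with urllib.request.urlopen(url) as resp:
    result = json.loads(resp.read())
server.shutdown()

print(result)  # {'message': 'hello world'}
```

Deployed behind a service that exposes a public hostname, the same handler is what Postman would hit; on a Databricks driver it only answers requests from inside the cluster's network.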