- 1503 Views
- 4 replies
- 0 kudos
Hi all. Environment: Nodes: Standard_E8s_v3; Databricks Runtime: 9.0; .NET for Apache Spark 2.0.0. I'm invoking spark-submit to run a .NET Spark job hosted in Azure Databricks. The job is written in C#/.NET, with its only transformation and action reading a C...
Latest Reply
Hi @Timothy Lin​, I recommend not using spark.stop() or System.exit(0) in your code, because they explicitly stop the Spark context, but then the graceful shutdown and handshake with Databricks' job service does not happen.
3 More Replies
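A minimal sketch of the shape this advice implies, in Python for brevity (the input path and the job logic are placeholders; the same structure applies to the C#/.NET job in the question):

```python
# Hypothetical sketch: let the caller (the Databricks job service) own the
# SparkSession lifecycle. The job code performs its work and simply returns;
# it never calls spark.stop() or System.exit(), so the service can complete
# its own graceful shutdown and status handshake afterwards.

def run_job(spark):
    """Run the job's transformations and actions, then return normally."""
    df = spark.read.csv("dbfs:/path/to/input.csv")  # placeholder input path
    return df.count()  # an action; no explicit shutdown afterwards
```

The point is structural: returning from the entry point (rather than tearing down the context yourself) leaves the shutdown sequence to the platform.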
- 759 Views
- 1 replies
- 0 kudos
Unlike all-purpose clusters, the Databricks job new-cluster configuration window does not have a "Libraries" tab in which specific Python modules can be installed. What's the best practice for installing Python modules on such clusters?
Latest Reply
It turns out the option exists outside the cluster configuration scope, in the task configuration window itself: under "Advanced options" -> "Add dependent libraries".
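The same thing can be expressed declaratively when defining the job through the Jobs API, where each task accepts a `libraries` array. A sketch of the relevant fragment (the task key and package name are placeholders):

```json
{
  "task_key": "my_task",
  "libraries": [
    { "pypi": { "package": "some-package==1.0.0" } }
  ]
}
```

Besides `pypi`, the `libraries` entries can also reference `maven` coordinates or `whl`/`jar` files uploaded to storage.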
- 2996 Views
- 5 replies
- 1 kudos
I want to get a mail notification at the end of each day when my Databricks job has finished running, and for that I need to extract its completion time and its status. How can I achieve that?
Latest Reply
Hi @Yatharth Kaushik​, you can use the JobsRunList API to get all the information about the job run, and write code to extract the fields you need for the table. There are multiple APIs in the same doc that you can use to get information a...
4 More Replies
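A hedged sketch of that approach against the Jobs API 2.1 `runs/list` endpoint (the host, token, and job ID are placeholders you would supply; field names follow the documented response shape, with `end_time` in epoch milliseconds and the outcome under `state.result_state`):

```python
import datetime
import json
import urllib.request


def summarize_runs(payload):
    """Pull completion time and result state out of a runs/list response."""
    summaries = []
    for run in payload.get("runs", []):
        end_ms = run.get("end_time", 0)
        summaries.append({
            "run_id": run.get("run_id"),
            "result_state": run.get("state", {}).get("result_state"),
            # end_time is epoch milliseconds; 0 means the run has not ended
            "ended_at": (
                datetime.datetime.fromtimestamp(
                    end_ms / 1000, tz=datetime.timezone.utc
                ).isoformat()
                if end_ms else None
            ),
        })
    return summaries


def fetch_completed_runs(host, token, job_id):
    """Call GET /api/2.1/jobs/runs/list for one job (placeholder credentials)."""
    url = f"{host}/api/2.1/jobs/runs/list?job_id={job_id}&completed_only=true"
    req = urllib.request.Request(
        url, headers={"Authorization": f"Bearer {token}"}
    )
    with urllib.request.urlopen(req) as resp:
        return summarize_runs(json.load(resp))
```

The summaries returned here (run ID, result state, ISO-formatted end time) are exactly the fields a daily notification script would feed into its mail body or status table.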