Resolved! Why is my job marked as failed in the Databricks Jobs UI, even though it completed all operations in the application?
I have a JAR job that was migrated from EMR to Databricks. The job runs as expected and completes all the operations in the application; however, the run is marked as failed in the Databricks Jobs UI.
Latest Reply
Calling spark.stop(), sc.stop(), or System.exit() in your application can cause this behavior. Databricks manages the Spark context shutdown on its own; forcefully closing it yourself can make an otherwise successful run be reported as failed.
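A minimal sketch of what the JAR's entry point might look like after removing those calls (the object and job logic here are illustrative, not from the original post):

```scala
import org.apache.spark.sql.SparkSession

object MyJob {
  def main(args: Array[String]): Unit = {
    // On Databricks, attach to the session the platform already created
    // rather than building and tearing down your own.
    val spark = SparkSession.builder().getOrCreate()

    // ... application logic, e.g. reading, transforming, and writing data ...
    val count = spark.range(100).count()
    println(s"Processed $count rows")

    // Do NOT call spark.stop(), sc.stop(), or System.exit() here.
    // Databricks shuts the context down itself once the run finishes;
    // simply returning from main lets the run be marked as succeeded.
  }
}
```

On EMR it is common to call `spark.stop()` at the end of `main`, which is why migrated jobs often carry this pattern over; on Databricks, removing the explicit shutdown is usually the fix.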