I am creating a new application and looking for ideas on how to handle exceptions in Spark, for example when running jobs through a ThreadPoolExecutor. Are there any good practices for error handling and for dealing with specific exceptions?
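To make the question concrete, here is a minimal sketch of the pattern I have in mind, using plain `concurrent.futures` so it runs without Spark. `run_job` is a placeholder for whatever would trigger a real Spark action (e.g. a write or a `count`), and the names are made up for illustration:

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def run_job(name):
    # Placeholder for a real Spark action; raises to simulate a failed job.
    if name == "bad":
        raise ValueError(f"job {name} failed")
    return f"job {name} ok"

results, errors = [], []
with ThreadPoolExecutor(max_workers=4) as pool:
    # Map each future back to the job name so failures can be attributed.
    futures = {pool.submit(run_job, n): n for n in ["a", "bad", "b"]}
    for fut in as_completed(futures):
        name = futures[fut]
        try:
            # .result() re-raises any exception raised in the worker thread.
            results.append(fut.result())
        except Exception as exc:
            errors.append((name, exc))
```

The part I am unsure about is what to do inside that `except`: whether to catch broad `Exception` like this, or match specific Spark/Py4J exception types and handle them differently (retry, skip, fail fast).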
I recently had a conversation with a Databricks architect, and he made me realise that Databricks customers mostly use Python, so that is the language they invest in. If you look at Delta Live Tables, it is written in Scala but users can only use it with Pytho...