What is special about Databricks? What does Databricks provide that no other tool in the market does? How can I convince someone to use Databricks instead of another tool?
If my cluster memory is 1 GB, for example, and my data is 1 TB, how will Spark handle it? If Spark is an in-memory computing engine, how does it handle data that is larger than the available memory?
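For context on the question above: my understanding (which may be incomplete) is that Spark does not load the whole dataset into memory at once; it splits the data into partitions, streams them through memory a batch at a time, and spills to disk when a partition does not fit. The same out-of-core idea can be sketched in plain Python with a hypothetical `chunked_sum` helper (no Spark required; this is only an analogy, not Spark's actual implementation):

```python
def chunked_sum(values, chunk_size=1000):
    """Sum an arbitrarily large iterable while holding at most
    `chunk_size` items in memory at a time -- the out-of-core
    pattern that lets a small-memory engine process big data."""
    total = 0
    chunk = []
    for v in values:
        chunk.append(v)
        if len(chunk) == chunk_size:
            total += sum(chunk)  # process the in-memory batch
            chunk.clear()        # free memory before the next batch
    return total + sum(chunk)    # don't forget the final partial batch

# A generator stands in for a dataset far larger than memory:
print(chunked_sum(range(1_000_000)))  # 499999500000
```

The key point of the sketch is that peak memory depends on `chunk_size` (analogous to a partition), not on the total size of the input.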
I read somewhere that Python code is converted to SQL in the end. Is that true, or is there a performance difference when working with Scala, Python, or SQL?
What will happen if the driver node fails? What will happen if one of the worker nodes fails? Is the behavior the same in Spark and Databricks, or does Databricks provide additional features to handle these situations?
Can you provide any resource where I could look into this? I'm just wondering: is Python code converted to SQL in the end? Or, as the other person mentioned, is it converted to Scala?