Can't speak for others, but for me it is a combination of things:
- autoscaling
- tuned spark clusters
- extra features like Databricks SQL, MLflow, etc.
- frequent updates
- reliability
I'm probably forgetting a few things, but another great asset (and this is mainly Spark related, not Databricks-specific) is that you can use Python/Scala instead of SQL.
Don't get me wrong: SQL is an excellent tool, but when things get complicated I prefer a general-purpose programming language over SQL.
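
To give a rough idea of what I mean (just a sketch, the table and column names are made up): say you want to trim and lowercase every string column in a table without listing them all out. In SQL that's a lot of copy-paste; in PySpark it's a short loop over the schema.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StringType

spark = SparkSession.builder.getOrCreate()

# hypothetical input table; in plain SQL you'd have to spell out every column by hand
df = spark.table("raw_customers")

# find the string columns, whatever they happen to be called
string_cols = [f.name for f in df.schema.fields if isinstance(f.dataType, StringType)]

# trim + lowercase the string columns, pass the rest through unchanged
cleaned = df.select(
    *[F.lower(F.trim(F.col(c))).alias(c) if c in string_cols else F.col(c)
      for c in df.columns]
)

cleaned.write.mode("overwrite").saveAsTable("clean_customers")
```

That kind of "do X to every column that matches Y" logic is where a real programming language starts to pay off for me.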