I'll try to answer this in the simplest possible way. Spark is an imperative programming framework: you tell it what to do, and it does it. DLT is declarative: you describe what you want the datasets to be (i.e. the transforms), and it takes care of the rest.
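A minimal sketch of the contrast, assuming a Databricks environment where a `spark` session and the `dlt` module are available (the table name and source path are made up for illustration; the DLT part only runs inside a Delta Live Tables pipeline, not as a standalone script):

```python
# Imperative Spark: you spell out each step and trigger it yourself.
raw = spark.read.format("json").load("/data/raw/orders")  # hypothetical path
cleaned = raw.filter("amount > 0")
cleaned.write.format("delta").mode("overwrite").saveAsTable("orders_clean")

# Declarative DLT: you only describe what the dataset should be;
# the pipeline engine decides when and how to (re)build it, tracks
# dependencies between tables, and manages the underlying storage.
import dlt

@dlt.table(comment="Orders with positive amounts")
def orders_clean():
    return (
        spark.read.format("json")
        .load("/data/raw/orders")
        .filter("amount > 0")
    )
```

In the imperative version you own ordering, retries, and writes; in the DLT version the function just returns a DataFrame defining the table, and the pipeline handles execution.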