Hi @Brant_Seibert_V, Delta Live Tables (DLT) is designed to handle both small and large datasets efficiently. It is built on Delta Lake, which scales from GBs to PBs. DLT is not limited to large datasets with billions of rows at TB scale; it can also manage millions of rows at GB scale effectively.
However, in terms of the Total Cost of Ownership, you need to consider the following:
1. Development costs (engineering time): DLT uses declarative syntax to define and manage DDL, DML, and infrastructure deployment, which can reduce development time and costs.
2. Operational costs (clusters, network, etc.): DLT requires a particular runtime version, which may impact the operational costs depending on your current Databricks setup.
3. Maintenance costs: DLT abstracts away complexities associated with the efficient application of updates, allowing users to focus on writing queries, which can reduce maintenance costs.
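To illustrate point 1, here is a minimal sketch of what DLT's declarative syntax looks like in SQL. You declare the tables and data-quality expectations you want, and DLT handles the DDL, orchestration, and incremental updates. The source path, table names, and column names below are hypothetical placeholders:

```sql
-- Declarative ingestion: DLT manages table creation and incremental loads.
-- "/data/orders" and the table/column names are illustrative only.
CREATE OR REFRESH STREAMING LIVE TABLE raw_orders
COMMENT "Raw orders ingested incrementally via Auto Loader"
AS SELECT * FROM cloud_files("/data/orders", "json");

-- Downstream table with a data-quality expectation; DLT tracks the
-- dependency on raw_orders and drops rows that violate the constraint.
CREATE OR REFRESH LIVE TABLE clean_orders (
  CONSTRAINT valid_order_id EXPECT (order_id IS NOT NULL) ON VIOLATION DROP ROW
)
COMMENT "Orders with a non-null order_id"
AS SELECT * FROM LIVE.raw_orders;
```

There is no hand-written MERGE logic, checkpoint management, or cluster orchestration here; that is the part of the development and maintenance cost that the declarative approach absorbs, regardless of whether the tables hold millions or billions of rows.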
Remember, the cost-effectiveness of DLT would depend on your specific use case and requirements.