Community Discussions
Connect with fellow community members to discuss general topics related to the Databricks platform, industry trends, and best practices. Share experiences, ask questions, and foster collaboration within the community.

DLT use case

Brant_Seibert_V
New Contributor II

Is Delta Live Tables (DLT) appropriate for data in the millions of rows and GB-sized range? Or is DLT only optimal for larger data with billions of rows at TB scale?

Please consider the Total Cost of Ownership.

  1. Development costs (engineering time)
  2. Operational costs (clusters, network, etc.)
  3. Maintenance costs

 

1 ACCEPTED SOLUTION

Accepted Solutions

Kaniz_Fatma
Community Manager
Community Manager

Hi @Brant_Seibert_V, Delta Live Tables (DLT) is designed to handle both small and large datasets efficiently. It is built on Delta Lake, whose underlying technology handles datasets ranging from GBs to PBs. DLT is not limited to large datasets with billions of rows at TB scale; it can also effectively manage datasets with millions of rows at GB scale.

However, in terms of the Total Cost of Ownership, you need to consider the following:

1. Development costs (engineering time): DLT uses declarative syntax to define and manage DDL, DML, and infrastructure deployment, which can reduce development time and costs.

2. Operational costs (clusters, network, etc.): DLT requires a particular runtime version, which may impact the operational costs depending on your current Databricks setup.

3. Maintenance costs: DLT abstracts away complexities associated with the efficient application of updates, allowing users to focus on writing queries, which can reduce maintenance costs.

Remember, the cost-effectiveness of DLT would depend on your specific use case and requirements.
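The declarative pattern behind point 1 can be sketched as a minimal DLT pipeline definition. Note this is an illustrative sketch, not standalone-runnable code: it only executes inside a Databricks DLT pipeline, where the `dlt` module and `spark` session are provided by the runtime, and the source path and table names below are hypothetical placeholders.

```python
# Minimal Delta Live Tables pipeline sketch (Python).
# Runs only inside a Databricks DLT pipeline; `dlt` and `spark`
# are supplied by the DLT runtime. Path and table names are
# illustrative placeholders.
import dlt
from pyspark.sql.functions import col

@dlt.table(comment="Raw orders ingested incrementally with Auto Loader.")
def orders_raw():
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load("/mnt/landing/orders/")  # hypothetical landing path
    )

@dlt.table(comment="Cleaned orders; DLT manages the DDL and update logic.")
@dlt.expect_or_drop("valid_amount", "amount > 0")  # declarative data-quality rule
def orders_clean():
    return dlt.read_stream("orders_raw").where(col("order_id").isNotNull())
```

You describe only the tables and their transformations; DLT infers the dependency graph, creates the tables, and applies updates, which is the source of the development- and maintenance-cost savings described above. This pattern works the same whether the source feed is GBs or TBs.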


2 REPLIES 2


Kaniz_Fatma
Community Manager
Community Manager

Hi @Brant_Seibert_V, thank you for posting your question in our community! We are happy to assist you.

To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question?

This will also help other community members who may have similar questions in the future. Thank you for your participation, and let us know if you need any further assistance!
