Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Delta Live Tables use case

Floody
New Contributor II

Hi all,

We have the following use case and are wondering if DLT is the correct approach.

Landing area with daily dumps of Parquet files into our Data Lake container.

The daily dump does a full overwrite of the Parquet file each time, keeping the same file name.

The idea would be to re-process the whole Parquet file each time and track the changes in the bronze table with SCD Type 2.

Suggestions on the best approach would be helpful.

Cheers

1 REPLY

Kaniz_Fatma
Community Manager

Hi @Floody, let's explore how Delta Lake and Delta Live Tables (DLT) could fit your use case.

  1. Delta Lake Overview:

    • Delta Lake is an open-source storage layer that brings ACID transactions to Apache Spark™ and big data workloads. It provides reliability, performance, and scalability for data lakes.
    • It's built on top of existing data lakes and stores data in the open Parquet format.
    • Key features include ACID transactions, schema enforcement, time travel, and data versioning.
  2. Your Use Case:

    • You have a daily dump of Parquet files into your data lake container.
    • The daily dump overwrites the Parquet files each time, but the file names remain the same.
  3. Approaches to Consider:

    • CONVERT TO DELTA: converts the existing Parquet files into a Delta table in place by adding a transaction log; the data files are not rewritten, and there is no incremental support.
    • CLONE Parquet: creates a Delta table from the Parquet source; the clone can be re-run to pick up changes incrementally and preserves the existing partitioning structure.

  4. Recommendation:

    • Considering your scenario, CLONE Parquet or CONVERT TO DELTA could be suitable.
    • If you need incremental support and want to maintain the existing partitioning structure, go for CLONE Parquet.
    • If you prefer a straightforward conversion without incremental support, choose CONVERT TO DELTA.
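As a sketch of the two options (the path and table name below are placeholders, not from the original post):

```sql
-- Option A: in-place conversion. Adds a Delta transaction log on top of the
-- existing Parquet files without rewriting the data. If the source directory
-- is partitioned, add a PARTITIONED BY (...) clause.
CONVERT TO DELTA parquet.`/mnt/landing/daily_dump/`;

-- Option B: clone into a new Delta table. Re-running the clone incrementally
-- picks up changes in the source Parquet directory.
CREATE OR REPLACE TABLE bronze_daily
  SHALLOW CLONE parquet.`/mnt/landing/daily_dump/`;
```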

Remember to adapt the approach based on your specific requirements and workload characteristics. Delta Lake provides flexibility and robustness for managing data in your data lake container.
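If you do go the Delta Live Tables route for the SCD Type 2 bronze table, a minimal DLT SQL pipeline could look like the following sketch. The table names, the key column `id`, and the sequence column `load_ts` are assumptions, since the post doesn't name them. Note that because the daily dump overwrites the same file name, Auto Loader needs `cloudFiles.allowOverwrites` enabled to re-process the file.

```sql
-- Ingest the daily Parquet dump with Auto Loader; allowOverwrites lets the
-- same file name be picked up again after each overwrite.
CREATE OR REFRESH STREAMING TABLE landing_raw AS
SELECT *, current_timestamp() AS load_ts
FROM cloud_files(
  '/mnt/landing/daily_dump/',   -- placeholder path
  'parquet',
  map('cloudFiles.allowOverwrites', 'true')
);

-- Maintain the bronze table as SCD Type 2: each change to a key closes the
-- current row and opens a new one with validity timestamps.
CREATE OR REFRESH STREAMING TABLE bronze_scd2;

APPLY CHANGES INTO live.bronze_scd2
FROM STREAM(live.landing_raw)
KEYS (id)               -- assumed business key
SEQUENCE BY load_ts     -- ordering column for resolving changes
STORED AS SCD TYPE 2;
```

This code only runs inside a Databricks DLT pipeline, not as standalone SQL.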

For more detailed implementation steps, refer to the Azure Databricks documentation.

 
