Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Optimal process for loading data where the full dataset is provided every day?

oakhill
New Contributor III

We receive several datasets where the full dump is delivered daily or weekly. What is the best way to ingest these into Databricks using DLT or plain PySpark while adhering to the medallion architecture?

1. If we use Auto Loader into Bronze, we'd end up incrementing the Bronze table by 100,000 rows every day (with 99% duplicates).

How would we then move changes or additions downstream?

4 REPLIES

Witold
Honored Contributor

In the case of full loads, you only need to pass

  .mode('overwrite')

while writing to your Bronze table. This is not related to Auto Loader.
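
For example, a minimal PySpark sketch (the path, file format, and table names are just placeholders, and `spark` is the session Databricks provides):

  from pyspark.sql import functions as F

  # Read today's full dump (path and format are placeholders)
  daily_dump = spark.read.parquet("/mnt/landing/customers/2024-06-01/")

  # Overwrite Bronze with the latest full load,
  # stamping each row with a load timestamp for traceability
  (daily_dump
      .withColumn("_ingested_at", F.current_timestamp())
      .write
      .mode("overwrite")
      .saveAsTable("bronze.customers"))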

oakhill
New Contributor III

Won't this cause trouble with CDC in the Silver layer, since the entire dataset is new? Or will it remember which rows from Bronze it has already read, even though the table is overwritten?

Witold
Honored Contributor

It depends on your logic. If the source sends you a full load, that might mean you need to reprocess everything, in all downstream layers as well.

If the source only sends you a full load because it isn't capable of identifying changes, then you should do CDC as early as possible. Usually a MERGE INTO with appropriate merge and update conditions will do the job.
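
A rough sketch with the Delta Lake Python API (the table names, the customer_id key, and the compared columns are all placeholders for your own schema):

  from delta.tables import DeltaTable

  updates = spark.read.table("bronze.customers")        # latest full load
  silver = DeltaTable.forName(spark, "silver.customers")

  (silver.alias("t")
      .merge(updates.alias("s"), "t.customer_id = s.customer_id")
      # update only rows that actually changed; use null-safe <=> if these columns can be NULL
      .whenMatchedUpdateAll(condition="t.name <> s.name OR t.email <> s.email")
      .whenNotMatchedInsertAll()
      .execute())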

dbrx_user
New Contributor III

Agree with @Witold to apply CDC as early as possible. Depending on where the initial files get deposited, I'd recommend adding an initial raw layer to your medallion that is just your cloud storage account, so each day or week the files land there. From there you can pull them into your Bronze layer using MERGE INTO to bring in only the new/latest data, roughly as in the sketch below.
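
A rough sketch of that raw-to-Bronze step using Auto Loader plus a foreachBatch merge (the paths, file format, schema/checkpoint locations, and the customer_id key are placeholders, and it assumes bronze.customers already exists as a Delta table):

  from delta.tables import DeltaTable

  def merge_into_bronze(batch_df, batch_id):
      # keep one row per key within the batch, then upsert into Bronze
      deduped = batch_df.dropDuplicates(["customer_id"])
      bronze = DeltaTable.forName(spark, "bronze.customers")
      (bronze.alias("t")
          .merge(deduped.alias("s"), "t.customer_id = s.customer_id")
          .whenMatchedUpdateAll()
          .whenNotMatchedInsertAll()
          .execute())

  (spark.readStream
      .format("cloudFiles")                                      # Auto Loader
      .option("cloudFiles.format", "csv")                        # format of the dumps (placeholder)
      .option("cloudFiles.schemaLocation", "/mnt/_schemas/customers")
      .load("/mnt/raw/customers/")                               # raw landing path (placeholder)
      .writeStream
      .foreachBatch(merge_into_bronze)
      .option("checkpointLocation", "/mnt/_checkpoints/bronze_customers")
      .trigger(availableNow=True)                                # process new files, then stop
      .start())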
