
DLT overwrite part of the table

erigaud
Honored Contributor

Hello!

We're currently building a file ingestion pipeline using Delta Live Tables and Auto Loader.

The bronze tables have roughly the following schema:

file_name | file_upload_date | colA | colB 

(Well, there are actually 250+ columns but you get the idea)

The bronze table is append-only, possibly with duplicates, because a file can be uploaded several times with corrections but always under the same name. The logic I'm trying to implement is the following:

- A file is loaded into bronze, let's say 500 rows with file_name = file_name_A.csv and the corresponding upload_date (that part is fine, just standard Auto Loader; see the sketch after this list)

- In silver we already have some rows (let's say 1000) for that file_name, but with an older upload_date. In that case we want to replace all 1000 rows with the newer 500 rows.
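
To make that concrete, the bronze table is roughly the following (a minimal sketch; the landing path, CSV options, and table name are illustrative):

import dlt
from pyspark.sql import functions as F

@dlt.table(name="bronze_files", comment="Append-only raw file ingestion")
def bronze_files():
    return (
        spark.readStream.format("cloudFiles")            # standard Auto Loader
        .option("cloudFiles.format", "csv")
        .option("header", "true")
        .load("/mnt/landing/files/")                     # illustrative landing path
        .withColumn("file_name", F.col("_metadata.file_name"))  # file metadata column
        .withColumn("file_upload_date", F.current_timestamp())
    )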

How would someone go about doing something like this using Delta Live Tables?

Thank you!

 

1 ACCEPTED SOLUTION


Tharun-Kumar
Databricks Employee

@erigaud 

Using jobs/workflows would be the right choice for this.


4 REPLIES

erigaud
Honored Contributor

Hello @Retired_mod. I am not sure MERGE INTO is the right solution to my problem, as I do not have a unique key in this situation: many rows with the same file_name would match many rows in my table.

Do you have a better solution?

Tharun-Kumar
Databricks Employee

@erigaud 

You could get the distinct file names from the new set of records and delete all of their entries from your silver table, then append the new records to it.
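
A minimal sketch of that step in a notebook cell, assuming plain Delta tables named bronze_files and silver_files and a hypothetical cutoff timestamp to pick out the newly arrived rows:

from delta.tables import DeltaTable
from pyspark.sql import functions as F

# Hypothetical cutoff: everything uploaded since the last run
cutoff = "2024-01-01 00:00:00"
new_batch = spark.table("bronze_files").where(F.col("file_upload_date") > F.lit(cutoff))

# Distinct file names present in the new batch
file_names = [r["file_name"] for r in new_batch.select("file_name").distinct().collect()]

# Delete the stale rows for those files, then append the fresh ones
silver = DeltaTable.forName(spark, "silver_files")
silver.delete(F.col("file_name").isin(file_names))
new_batch.write.format("delta").mode("append").saveAsTable("silver_files")

Delta can also do this in one atomic write: an overwrite with a replaceWhere option restricted to those file names, instead of the separate delete and append.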

erigaud
Honored Contributor

@Tharun-Kumar 

That's the solution I was thinking of, but is there a clean way to do that using DLT, or should I just use a regular notebook task and simple Delta tables?

Tharun-Kumar
Databricks Employee

@erigaud 

Using jobs/workflows would be the right choice for this.
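
For example, a job where the DLT pipeline runs first and the silver delete-and-append notebook runs as a dependent task. A minimal sketch using the Databricks Python SDK (the job name, pipeline ID, and notebook path are placeholders):

from databricks.sdk import WorkspaceClient
from databricks.sdk.service import jobs

w = WorkspaceClient()

# Two-task job: run the DLT pipeline, then the notebook that rewrites silver
w.jobs.create(
    name="file-ingestion",
    tasks=[
        jobs.Task(
            task_key="bronze_dlt",
            pipeline_task=jobs.PipelineTask(pipeline_id="<your-pipeline-id>"),
        ),
        jobs.Task(
            task_key="silver_replace",
            depends_on=[jobs.TaskDependency(task_key="bronze_dlt")],
            notebook_task=jobs.NotebookTask(notebook_path="/Workspace/Users/you/silver_replace"),
            # attach a cluster spec (e.g. new_cluster=...) unless the job runs on serverless compute
        ),
    ],
)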
