10-23-2024 12:24 AM
I have a DLT pipeline that has been running daily for months, and I recently found an issue in my Silver layer code. As a result, I now have faulty data in my Silver schema. Please note that the tables in the Silver schema are streaming tables managed within the context of DLT.
I want to delete the faulty records from my Silver schema and keep the pipeline running with the corrected code, as it is otherwise running normally. Could someone please suggest the best possible approach for updating my Silver schema streaming tables by deleting the faulty records? An early response would be highly appreciated.
10-23-2024 01:14 AM
Use Delta Lake's DELETE command to remove the faulty records from your Silver tables. You can do this in a Databricks notebook or a separate script.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("DeleteFaultyRecords").getOrCreate()

# Define the criteria that identifies the faulty records
faulty_criteria = "your_faulty_criteria_here"

# List of Silver tables to clean
silver_tables = ["silver_table1", "silver_table2"]

for table in silver_tables:
    spark.sql(f"DELETE FROM {table} WHERE {faulty_criteria}")
Also, ensure that your DLT pipeline code is corrected to prevent future faulty records: update the transformation logic in your Silver layer to handle the data correctly (Spark DataFrame transformations or DLT expectations/constraints).
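For example, here is a minimal sketch of a DLT expectation that drops bad rows before they land in a Silver table. The table name, the Bronze source, and the "amount >= 0" rule are hypothetical placeholders, not from this thread:

import dlt

# Hypothetical Silver table definition; @dlt.expect_or_drop silently drops any
# row that violates the named constraint instead of writing it to the table.
@dlt.table(name="silver_table1")
@dlt.expect_or_drop("valid_amount", "amount >= 0")  # placeholder rule
def silver_table1():
    # "bronze_table1" is a placeholder for your actual Bronze streaming source
    return dlt.read_stream("bronze_table1")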
10-23-2024 01:26 AM
Hi Fatimah,
You can delete the records from the Silver layer as long as those records don't get reloaded again (from Bronze). More info is here.
May I also ask: are you using the CDC apply_changes method for loading data into Silver (like SCD type 1)? If not, the above can definitely be done.
Cheers
10-23-2024 01:33 AM
Yes, I'm using the CDC apply_changes method with SCD type 1.
10-25-2024 02:04 PM
In that case, what's the expression in the "apply_as_deletes" option? Or could you please share your CDC apply_changes code block?
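For reference, a minimal sketch of what an apply_changes call with apply_as_deletes looks like. The table names, the "id" key, the "event_ts" sequencing column, and the "operation" column are hypothetical placeholders:

import dlt
from pyspark.sql.functions import col, expr

# Target streaming table for the CDC feed
dlt.create_streaming_table("silver_table1")

dlt.apply_changes(
    target="silver_table1",
    source="bronze_cdc_feed",      # placeholder Bronze CDC source
    keys=["id"],                   # placeholder primary key
    sequence_by=col("event_ts"),   # placeholder ordering column
    # Rows matching this expression are applied as deletes in the target
    apply_as_deletes=expr("operation = 'DELETE'"),
    stored_as_scd_type=1,
)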
10-27-2024 11:37 PM
I do not have the "apply_as_deletes" option.
Here's my apply_changes code