Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.
We are trying to migrate to Delta Live Tables an Azure Data Factory pipeline which loads CSV files and outputs Delta Tables in Databricks. The pipeline is triggered on demand via an external application which places the files in a Storage folder and t...
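For context, a minimal sketch of what the DLT ingestion side of such a pipeline could look like, using Auto Loader to pick up CSV files from a landing folder (the table name and storage path below are placeholders, not details from the original pipeline):

import dlt

@dlt.table(
    name="raw_orders",  # placeholder target table name
    comment="CSV files dropped by the external application, loaded incrementally."
)
def raw_orders():
    # Auto Loader (cloudFiles) picks up new CSV files as they land;
    # `spark` is available implicitly inside a DLT pipeline notebook.
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "csv")
        .option("header", "true")
        .load("abfss://landing@<storage-account>.dfs.core.windows.net/orders/")  # placeholder path
    )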
@Enric Llop: When using Delta Live Tables to perform a "rip and replace" operation, where you want to replace the existing data in a table with new data, there are a few things to keep in mind. First, the apply_changes function is used to apply chang...
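As a rough illustration of the API being discussed, here is a hedged sketch of dlt.apply_changes maintaining a target table from a CDC source; the table and column names are assumptions for the example, not taken from the thread:

import dlt
from pyspark.sql.functions import col

# Target streaming table that apply_changes keeps up to date.
dlt.create_streaming_table("customers")

dlt.apply_changes(
    target="customers",
    source="customers_cdc_raw",      # hypothetical upstream view carrying the change feed
    keys=["customer_id"],            # key columns used to match rows
    sequence_by=col("event_ts"),     # ordering column used to resolve out-of-order changes
    stored_as_scd_type=1,            # SCD type 1: keep only the latest version of each row
)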
Hi there, I am using apply_changes (a.k.a. Delta Live Tables Change Data Capture) and it works fine. However, it seems to automatically create a secondary table in the database metastore called _apply_storage_changes_{tableName}. So for every table I use ...
I have read that Delta Live Tables will keep a history of 7 days. However, after creating a streaming live table and using the dlt.apply_changes function with this code:
def run_pipeline(table_name,keys,sequence_by):
    lower_table_name = table_name.l...
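For reference, a minimal sketch of what a wrapper like the one quoted above might look like; the .lower() completion, the source view name, and the example arguments are assumptions, not the poster's actual code:

import dlt
from pyspark.sql.functions import col

def run_pipeline(table_name, keys, sequence_by):
    # Assumed completion of the truncated line: normalize the table name.
    lower_table_name = table_name.lower()

    # Streaming target table that apply_changes maintains.
    dlt.create_streaming_table(lower_table_name)

    dlt.apply_changes(
        target=lower_table_name,
        source=f"{lower_table_name}_source",  # hypothetical upstream view
        keys=keys,
        sequence_by=col(sequence_by),
    )

# Hypothetical invocation:
run_pipeline("Customers", keys=["customer_id"], sequence_by="event_ts")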
Hi @Logan Nicol, hope all is well! Just wanted to check in to see if you were able to resolve your issue. If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Thanks...