Hi @Retired_mod, @Avnish_Jain, how can I implement CDC in SQL DLT pipelines with a live table (not streaming)? I am trying to implement the below, where I am reading from external tables, loading data into the bronze layer, and then want to apply these change...
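For context, the pattern I have in mind is roughly the standard APPLY CHANGES one below (an untested sketch; bronze_orders, silver_orders and the order_id, operation and sequence_ts columns are placeholder names, and it assumes the bronze table can be read as a stream):

-- Target streaming table that receives the merged changes
CREATE OR REFRESH STREAMING TABLE silver_orders;

-- Apply CDC events from bronze into silver as SCD type 1
APPLY CHANGES INTO live.silver_orders
FROM STREAM(live.bronze_orders)
KEYS (order_id)
APPLY AS DELETE WHEN operation = 'DELETE'
SEQUENCE BY sequence_ts
COLUMNS * EXCEPT (operation, sequence_ts)
STORED AS SCD TYPE 1;

My question is essentially whether there is an equivalent when the source is a live table (MV) rather than a stream.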
Hi All, I am unable to create a Delta table with the partitioning option. Can someone please point out what I am missing and help me with an updated query?

CREATE OR REPLACE TABLE invoice
USING DELTA
PARTITION BY (year(shp_dt), month(shp_dt))
LOCATION '/ta...
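From what I understand, Delta expects PARTITIONED BY (not PARTITION BY) with plain column names rather than expressions, so I suspect the fix is to move the year/month expressions into generated columns, roughly like this (an untested sketch; the invoice_id column and the types are guesses, and the original LOCATION clause would stay as it was):

CREATE OR REPLACE TABLE invoice (
  invoice_id BIGINT,
  shp_dt     TIMESTAMP,
  -- generated columns so the table can be partitioned on year/month of shp_dt
  shp_year   INT GENERATED ALWAYS AS (year(shp_dt)),
  shp_month  INT GENERATED ALWAYS AS (month(shp_dt))
)
USING DELTA
PARTITIONED BY (shp_year, shp_month);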
Hi @Retired_mod - how do I log the number of rows read and written in a DLT pipeline? I want to store them in audit tables after the pipeline update completes. Can you give me sample query code?
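What I have pieced together so far from the event log is something like the below (untested; my_catalog.my_schema.silver_orders is a placeholder), which I was planning to run as a job task after the pipeline update and insert into an audit table - I would like to confirm this is the right approach:

-- Row counts per flow from the DLT event log
SELECT
  timestamp,
  origin.flow_name,
  details:flow_progress.metrics.num_output_rows AS num_output_rows
FROM event_log(TABLE(my_catalog.my_schema.silver_orders))
WHERE event_type = 'flow_progress'
  AND details:flow_progress.metrics.num_output_rows IS NOT NULL;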
What is the best practice for implementing parameterization in SQL DLT pipelines (specifically), so that it is easy and no manual intervention is required to migrate from dev_region to prod_region?
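For example, I was thinking of setting a pipeline configuration key per environment and referencing it with string interpolation in the SQL (an untested sketch; source_region and the table names are placeholders):

-- Pipeline settings (dev):  configuration { "source_region": "dev_region" }
-- Pipeline settings (prod): configuration { "source_region": "prod_region" }
CREATE OR REFRESH LIVE TABLE bronze_invoices
AS SELECT * FROM ${source_region}.landing.invoices;

Is this the recommended pattern, or is there a better one?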
Thanks @Retired_mod, but I asked how to log the number of rows read/written via a Delta Live Tables (DLT) pipeline, not a Delta Lake table, and the solution you gave relates to a Data Factory pipeline, which is not what I need.
Below is the error I am getting. I am extracting data using queries into a live table (MV) and then intend to use it to merge changes into a streaming silver table; I need to know the correct approach, as the official documentation is unclear. Error details...
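My current thinking is that the table feeding the merge has to be a streaming table rather than a live table (MV), since a materialized view cannot be used as a streaming source, roughly like this (an untested sketch; it assumes the external source is an append-only Delta table, and the names are placeholders):

-- Declare the intermediate dataset as a streaming table instead of a live table (MV)
CREATE OR REFRESH STREAMING TABLE bronze_changes
AS SELECT * FROM STREAM(source_catalog.source_schema.external_orders);

The silver streaming table would then be populated from STREAM(live.bronze_changes) with APPLY CHANGES INTO. Is that the intended approach?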
One possible solution could be to handle the deserialization of the Protobuf messages differently. Instead of using a message-specific deserializer, you could use a ByteArrayDeserializer and do the conversion in your listener. Then, you could use a ByteArraySerializ...