Hi @Faisal, certainly! To log the number of rows read and written in a DLT (Data Lake Table) pipeline and store them in an audit table after the pipeline update completes, you can follow these steps:
Capture Metrics in Azure Data Factory (ADF):
- In your pipeline, after the data copy activity, add an additional activity (e.g., a Set Variable activity).
- Use the output JSON of the Copy Activity to extract the relevant metrics; specifically, the rowsRead and rowsCopied properties.
- Store these values in pipeline variables (see the expression sketch after this list).
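As a minimal sketch, the Value of each Set Variable activity could use dynamic content like the following. The Copy Activity name CopyData is an assumption (substitute your own), and the string() conversion is there because ADF pipeline variables are typically of type String:

Set Variable "sourceRowCount": @string(activity('CopyData').output.rowsRead)
Set Variable "sinkRowCount": @string(activity('CopyData').output.rowsCopied)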
Invoke a Stored Procedure:
- Create a stored procedure in your SQL Server database that accepts the row counts as input parameters.
- In your pipeline, use a Stored Procedure activity to invoke this stored procedure.
- Pass the rowsRead and rowsCopied values as parameters to the stored procedure.
- The stored procedure will insert these values into your metadata table.
Example Query Code:
-- Create a stored procedure to insert row counts into the metadata table
CREATE PROCEDURE dbo.InsertRowCounts
    @sourceRowCount INT,
    @sinkRowCount INT
AS
BEGIN
    INSERT INTO dbo.AuditMetadata (SourceRowCount, SinkRowCount, Timestamp)
    VALUES (@sourceRowCount, @sinkRowCount, GETDATE());
END;
-- In your pipeline, the Stored Procedure activity performs the equivalent of:
--   EXEC dbo.InsertRowCounts @sourceRowCount = ..., @sinkRowCount = ...;
-- In the activity, map each parameter to the corresponding pipeline variable
-- using dynamic content, e.g. @variables('sourceRowCount') and @variables('sinkRowCount').
Note: Replace dbo.AuditMetadata with your actual metadata table name.
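If you do not already have an audit table, a minimal sketch matching the columns used above might look like this (the column types and the identity key are assumptions; adjust them to your schema):

CREATE TABLE dbo.AuditMetadata (
    AuditId        INT IDENTITY(1,1) PRIMARY KEY, -- assumed surrogate key
    SourceRowCount INT NOT NULL,
    SinkRowCount   INT NOT NULL,
    Timestamp      DATETIME NOT NULL
);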
Handle Errors:
- Ensure proper error handling in case the stored procedure fails to execute.
- You can use TRY...CATCH blocks inside the stored procedure, or an on-failure path in the pipeline, to handle exceptions (see the sketch below).
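As a minimal sketch, the insert inside the stored procedure could be wrapped in a TRY...CATCH block so that a failure is re-thrown and the Stored Procedure activity (and therefore the pipeline run) is marked as failed; using CREATE OR ALTER and THROW here is an assumption (requires SQL Server 2016 SP1 or later):

CREATE OR ALTER PROCEDURE dbo.InsertRowCounts
    @sourceRowCount INT,
    @sinkRowCount INT
AS
BEGIN
    BEGIN TRY
        INSERT INTO dbo.AuditMetadata (SourceRowCount, SinkRowCount, Timestamp)
        VALUES (@sourceRowCount, @sinkRowCount, GETDATE());
    END TRY
    BEGIN CATCH
        -- Surface the error so the calling activity fails instead of silently succeeding
        THROW;
    END CATCH
END;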
Remember to adjust the code according to your specific database schema and naming conventions. If you encounter any issues, feel free to ask for further assistance!