If you're using Delta Live Tables (DLT) to simplify ETL processes and manage data pipelines at scale, there's great news: you can now access query history and query profiles to gain deep insights into your pipeline runs. This feature enables you to:
✅ Debug queries effortlessly.
✅ Identify performance bottlenecks.
✅ Optimize your pipeline execution.
Here's how you can make the most of this feature:
Prerequisites
1️⃣ Your pipeline must run on the preview channel (Delta Live Tables runtime channels).
2️⃣ It must be configured in triggered mode.
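Both prerequisites correspond to fields in the pipeline's settings. A minimal sketch of a pipeline settings JSON, assuming a hypothetical pipeline name and notebook path: `"channel": "PREVIEW"` selects the preview runtime channel, and `"continuous": false` runs the pipeline in triggered mode.

```json
{
  "name": "my_dlt_pipeline",
  "channel": "PREVIEW",
  "continuous": false,
  "libraries": [
    { "notebook": { "path": "/Repos/etl/dlt_pipeline_notebook" } }
  ]
}
```

You can apply these settings from the pipeline's JSON editor in the DLT UI, or when creating the pipeline via the Pipelines API.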
Accessing Query History
1️⃣ From the Query History UI:
Click the Query History icon in the Databricks sidebar.
Use the Compute filter to select Delta Live Tables (DLT) compute.
Click a query to view summary details, such as duration and aggregated metrics.
Dive deeper by clicking See query profile, where you'll find granular query performance insights.
2️⃣ From Notebooks:
Open a notebook attached to your DLT pipeline.
At the bottom of the notebook, switch to the DLT Query History tab.
Click a query name for detailed insights, such as execution duration, query source, and metrics.
3️⃣ From the DLT Pipeline UI:
Navigate to the Pipeline Details page for your pipeline.
Select the Query History tab at the bottom of the screen.
Click on any query to view its detailed profile.
Why This Matters
Understanding query history helps data teams identify inefficiencies, debug faster, and improve data flow performance, all while working directly in the familiar Databricks ecosystem.
Limitations
⚠️ Provisioning and queued times are currently unavailable.
⚠️ Metrics update live during execution, but the full query profile becomes available only after execution finishes.
💡 How to Enable: Workspace admins can activate this feature from the Previews page in the Databricks admin console.
Are you ready to elevate your Delta Live Tables workflows? Share your experience in the comments! 👇
#Databricks #DeltaLiveTables #ETL #DataEngineering #DataOps
Ajay Kumar Pandey