Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

DLT Publish event log to metastore

ankit001mittal
New Contributor III

Hi Guys,

I am trying to use the DLT "Publish event log to metastore" feature.

I noticed it creates a separate table with the logs for each DLT pipeline. Does this mean it maintains a separate log table for every pipeline (in our case, we have thousands of tables built through DLT)? Or is there a way to consolidate all the logs into one or two tables, similar to the job and job task run timeline tables in the system.lakeflow schema?
Please let me know if my approach is wrong here, and whether there is a more efficient way to handle this.

1 REPLY

SP_6721
New Contributor III

Hi @ankit001mittal,

Yes, you're right: when you enable the "Publish event log to metastore" option for DLT pipelines, Databricks creates a separate event log table for each pipeline. So if you have thousands of pipelines, you'll see thousands of log tables. There's currently no built-in feature that automatically combines all of those logs into a single centralised table the way the system.lakeflow system tables do.
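For reference, here is a minimal PySpark sketch of querying one pipeline's published log in a Databricks notebook. The table name main.monitoring.my_pipeline_event_log is hypothetical; substitute the catalog, schema and name you picked when enabling the option:

# `spark` is predefined in Databricks notebooks.
# Hypothetical three-level name -- use whatever you configured for this pipeline.
events = spark.table("main.monitoring.my_pipeline_event_log")

# The published table follows the standard DLT event log schema, which includes
# columns such as timestamp, event_type, message and an `origin` struct.
(events
    .where("event_type = 'flow_progress'")
    .select("timestamp", "origin.flow_name", "message")
    .orderBy("timestamp", ascending=False)
    .show(truncate=False))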

Your approach is valid. If you want a unified view of all logs across pipelines, you can create a SQL view that applies UNION ALL across the individual event log tables (see the sketch below), or set up a scheduled or streaming job that regularly consolidates the logs into one Delta table for easier querying and dashboarding.
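As a rough illustration of the view approach, here is a PySpark sketch. It assumes, hypothetically, that every pipeline publishes its event log into one schema, monitoring.dlt_logs, with table names ending in _event_log, and that all of the logs share the standard event log schema:

# Discover the published event log tables in the (hypothetical) schema.
rows = spark.sql("SHOW TABLES IN monitoring.dlt_logs").collect()
log_tables = [r.tableName for r in rows if r.tableName.endswith("_event_log")]
assert log_tables, "no event log tables found in monitoring.dlt_logs"

# Stitch them into one view; the extra column records which
# pipeline's log each row came from.
union_sql = " UNION ALL ".join(
    f"SELECT '{t}' AS source_log_table, * FROM monitoring.dlt_logs.{t}"
    for t in log_tables
)
spark.sql(
    f"CREATE OR REPLACE VIEW monitoring.dlt_logs.all_dlt_events AS {union_sql}"
)

One caveat: a view that unions thousands of tables can be slow to plan and query, so at that scale a consolidation job writing into a single Delta table is usually the better fit.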
