Hi Guys,
I am trying to use the DLT "Publish event log to metastore" feature.
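
For reference, this is roughly how I am enabling it in each pipeline's settings today (shown as a Python dict; the catalog, schema, and table names below are just placeholders for illustration):

```python
# Sketch of the relevant fragment of a DLT pipeline's settings JSON,
# written as a Python dict. Catalog/schema/name values are placeholders.
pipeline_settings_fragment = {
    "event_log": {
        "catalog": "main",                   # target catalog for the published event log
        "schema": "dlt_monitoring",          # target schema
        "name": "event_log_pipeline_a",      # ends up as one table per pipeline
    }
}
```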

I noticed that it creates a separate table with the logs for each DLT pipeline. Does that mean a separate log table will be maintained across all of our DLT tables (in our case, we have thousands of tables built through DLT)? Or is there a way to consolidate all the logs into one or two DLT log tables, similar to the job and job task run timeline tables in the system.lakeflow schema?
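
The only workaround I can think of so far is to build a union view over the per-pipeline event log tables myself, roughly like the sketch below (the table names are hypothetical), but that feels hard to maintain at thousands of pipelines:

```python
from functools import reduce

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Hypothetical per-pipeline event log tables published to the metastore.
# With thousands of pipelines this list would have to be generated,
# e.g. from information_schema.tables, rather than hard-coded.
event_log_tables = [
    "main.dlt_monitoring.event_log_pipeline_a",
    "main.dlt_monitoring.event_log_pipeline_b",
]

# Tag each event log with its source table, then union everything into one DataFrame.
dfs = [
    spark.table(t).withColumn("source_event_log", F.lit(t))
    for t in event_log_tables
]
all_events = reduce(
    lambda a, b: a.unionByName(b, allowMissingColumns=True), dfs
)

# Expose the combined log as a single view for monitoring queries.
all_events.createOrReplaceTempView("all_dlt_event_logs")
```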
Please let me know if my approach is wrong here, or if there is a more efficient way to handle this.