@angel_ba - This is expected/designed behaviour.
Audit logs are ingested into the system tables asynchronously; Databricks batches these events before surfacing them in the UC system tables, so some delay is expected.
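If you want to see how far behind the audit system table is running, you can compare its latest event_time with the current time. A minimal sketch, assuming a Databricks notebook where `spark` is predefined and your account has been granted access to system.access.audit:

```python
# Minimal sketch: check the ingestion lag of the UC audit system table.
# Assumes access to system.access.audit and a notebook where `spark` exists.
lag_df = spark.sql("""
    SELECT
        max(event_time) AS latest_audit_event,
        current_timestamp() AS now,
        (unix_timestamp(current_timestamp()) - unix_timestamp(max(event_time))) / 60 AS lag_minutes
    FROM system.access.audit
""")
lag_df.show(truncate=False)
```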
Alternatively, and perhaps the best way, use the Jobs API for the start/completion time (I presume you are using Jobs for the pipeline). It updates almost instantly.
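A minimal sketch of pulling run start/end times from the Jobs API 2.1 runs/list endpoint. The host, token, and job_id below are placeholders you would replace with your own values:

```python
# Minimal sketch: fetch recent run start/end times via the Jobs API 2.1.
# DATABRICKS_HOST, DATABRICKS_TOKEN and job_id are placeholders.
import os
import datetime
import requests

host = os.environ["DATABRICKS_HOST"]    # e.g. https://<workspace>.cloud.databricks.com
token = os.environ["DATABRICKS_TOKEN"]  # PAT or service principal token
job_id = 123456789                      # the job that runs your pipeline

resp = requests.get(
    f"{host}/api/2.1/jobs/runs/list",
    headers={"Authorization": f"Bearer {token}"},
    params={"job_id": job_id, "limit": 5},
)
resp.raise_for_status()

for run in resp.json().get("runs", []):
    # start_time / end_time are epoch milliseconds; end_time is 0 while the run is still active
    start = datetime.datetime.fromtimestamp(run["start_time"] / 1000)
    end = datetime.datetime.fromtimestamp(run["end_time"] / 1000) if run.get("end_time") else None
    print(run["run_id"], start, end, run["state"]["life_cycle_state"])
```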
A second alternative is the DLT event log itself, which you can query in Unity Catalog at catalog.schema.event_log.
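A minimal sketch of reading update start/completion state from the published DLT event log. The three-level name my_catalog.my_schema.event_log is a placeholder for wherever your pipeline publishes its event log, and the query assumes the documented update_progress event type:

```python
# Minimal sketch: track pipeline update progress from the DLT event log.
# Replace my_catalog.my_schema.event_log with your pipeline's published event log table.
progress_df = spark.sql("""
    SELECT
        origin.update_id,
        timestamp,
        details:update_progress.state AS update_state
    FROM my_catalog.my_schema.event_log
    WHERE event_type = 'update_progress'
    ORDER BY timestamp
""")
progress_df.show(truncate=False)
```

The update_state values (e.g. RUNNING, COMPLETED, FAILED) together with the timestamp column give you per-update start and completion times without waiting for the audit logs to land.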
RG #Driving Business Outcomes with Data Intelligence