Hi everyone,
I regularly (once a week) run the ANALYZE TABLE ... COMPUTE STATISTICS command on all my tables in Databricks to keep statistics up to date for query optimization.
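For context, this is roughly what the weekly job runs for each table (the table name below is just a placeholder; in practice I loop over all tables):

```python
# Weekly statistics refresh -- "my_schema.my_table" is only a placeholder name,
# the real job iterates over every table in the workspace.
# "spark" is the SparkSession available in a Databricks notebook/job.
spark.sql("ANALYZE TABLE my_schema.my_table COMPUTE STATISTICS")
```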
In the Databricks Catalog UI for the table, I can see statistics metadata such as spark.statistics and spark.statistics.createdAt. However, when I query the table properties, run DESCRIBE TABLE EXTENDED, or inspect the table metadata programmatically, I can only retrieve statistics such as table size and row count, but not the spark.statistics.createdAt value that indicates when the statistics were last computed.
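This is roughly how I check the metadata today (table name is again a placeholder), and it only surfaces size and row count:

```python
# Look for the "Statistics" row in DESCRIBE TABLE EXTENDED --
# on my tables this only shows total size and row count,
# with no timestamp for when the stats were computed.
(spark.sql("DESCRIBE TABLE EXTENDED my_schema.my_table")
      .filter("col_name = 'Statistics'")
      .show(truncate=False))

# The table properties I get back don't expose spark.statistics.createdAt either.
spark.sql("SHOW TBLPROPERTIES my_schema.my_table").show(truncate=False)
```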

I need to store this last-updated time in another table. Could anyone suggest a reliable method or the right command to retrieve the exact time when statistics were last updated or created for a table in Databricks/Spark?
Thanks in advance!