Hi @data_learner1 ,
Audit logs focus on security and usage monitoring, such as user access and table read/write events; they don't capture schema-level changes.
To track schema-level changes, the Delta transaction log is the best source. The log is stored as JSON files in a _delta_log directory under the table's root directory, one file per commit. Here is a sample snippet you can use to query those files for a table:
# Each JSON file in _delta_log records one commit; the metaData column is
# non-null only for commits that set or changed the table's metadata (schema).
df = spark.read.json("/path/to/your/delta/table/_delta_log/*.json")
df.select("commitInfo", "metaData").where("metaData IS NOT NULL").display()
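If you want to inspect the raw log outside Spark, the same structure is visible with plain Python. A minimal sketch using two hypothetical log lines (the contents are illustrative, not from a real table), assuming the standard Delta layout where each line of a log file is one JSON action and schema changes carry a metaData action:

```python
import json

# Hypothetical sample lines from a _delta_log JSON file: one plain write
# (commitInfo only) and one commit that changed the schema (metaData action).
log_lines = [
    '{"commitInfo": {"operation": "WRITE", "timestamp": 1700000000000}}',
    '{"metaData": {"id": "abc-123", "schemaString": "{\\"type\\":\\"struct\\",\\"fields\\":[]}"}}',
]

actions = [json.loads(line) for line in log_lines]

# Schema-level changes are the entries that carry a metaData action.
schema_changes = [a["metaData"] for a in actions if "metaData" in a]

for md in schema_changes:
    print(md["schemaString"])
```

This prints the schema JSON recorded by the schema-changing commit and skips the plain write, which mirrors what the `metaData IS NOT NULL` filter does in the Spark snippet above.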
Hope this helps!