Reading table changes with a timestamp or version greater than the latest table commit throws an error, and this behaviour can be changed with the timestampOutOfRange.enabled flag.
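In a notebook attached to a regular cluster I could simply set that flag before running the query (assuming the full config key is spark.databricks.delta.changeDataFeed.timestampOutOfRange.enabled, as described in the docs linked below), something like:

-- set on the session of a notebook attached to a regular cluster
SET spark.databricks.delta.changeDataFeed.timestampOutOfRange.enabled = true;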
My issue is that I use a SQL endpoint, and I don't see any way of providing this Spark flag there.
SELECT *
FROM table_changes(
  'table_name',
  '2023-05-17' -- greater than the latest table commit
)
https://docs.databricks.com/delta/delta-change-data-feed.html