I am running a Delta Live Tables pipeline that explodes JSON documents into small Delta tables. The documents can receive multiple updates over the lifecycle of a transaction, and I am curating the data with a medallion architecture. When I trigger a full refresh via the `/update` API with
{"full_refresh": true}
it resets the checkpoints and runs fine. When I try to run an incremental update instead, I get the following error:
org.apache.spark.sql.streaming.StreamingQueryException: Query dlt_fulfillment_tickets [id = 7c256d93-6271-4013-9d5d-fe356c18511f, runId = 1aba276e-1118-43c1-b1fa-85e688bf523b] terminated with exception: Detected a data update (for example part-00000-ba0db042-39f9-450b-ad19-3f05afb52830-c000.snappy.parquet) in the source table at version 10. This is currently not supported. If you'd like to ignore updates, set the option 'ignoreChanges' to 'true'. If you would like the data update to be reflected, please restart this query with a fresh checkpoint directory.
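For reference, the update call I am issuing looks roughly like this (the workspace URL, pipeline ID, and token are placeholders, and I have omitted the actual network send so the sketch stands alone):

```python
import json
from urllib.request import Request

# Placeholders -- substitute your own workspace URL and pipeline ID.
WORKSPACE = "https://example.cloud.databricks.com"
PIPELINE_ID = "0000-aaaa-bbbb-cccc"

# Body for a full refresh. Note full_refresh is a JSON boolean, not the
# string "true"; for an incremental run, omit it or set it to False.
payload = {"full_refresh": True}

req = Request(
    url=f"{WORKSPACE}/api/2.0/pipelines/{PIPELINE_ID}/updates",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": "Bearer <token>",
        "Content-Type": "application/json",
    },
    method="POST",
)
# urllib.request.urlopen(req) would start the update; not executed here.
```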
Is there a way to set the `ignoreChanges` option via SQL? My entire pipeline is written in SQL.