My team has a workflow that currently runs Databricks SQL on a standalone cluster. We are trying to switch this job to a SQL warehouse, but I keep getting errors.

The current job runs in a for-each loop to break the work into smaller parts: each iteration works on one partition of the data, and dynamic partition overwrite mode ensures only that partition is updated (first sketch below).

When I try to do something similar on a SQL warehouse, either with `REPLACE WHERE` or by setting the overwrite mode (documented here), I get errors: the warehouse doesn't recognize the `REPLACE WHERE` keywords, and it rejects the config values (second sketch below).

So is this just not possible when using SQL warehouses? Is an overwrite always all or nothing? It seems like the only way to do something like this would be a two-step process: first delete anything matching the partition key, then insert the new data, rather than a one-step dynamic insert-overwrite or `REPLACE WHERE` (third sketch below).
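For reference, here is roughly what each loop iteration does on the cluster today. The table, staging source, and partition column (`sales`, `staging_sales`, `event_date`) are placeholders, not our real schema:

```sql
-- Works on the standalone cluster: with dynamic partition overwrite enabled,
-- INSERT OVERWRITE only replaces partitions present in the query result.
SET spark.sql.sources.partitionOverwriteMode = DYNAMIC;

INSERT OVERWRITE TABLE sales
SELECT *
FROM staging_sales
WHERE event_date = '2024-01-15';  -- one partition per loop iteration
```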
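And here are the two variants I've tried on the SQL warehouse (same placeholder names). Both fail for me, the first on the `REPLACE WHERE` keywords and the second on the config setting:

```sql
-- Variant 1: REPLACE WHERE (the warehouse errors on the keywords).
INSERT INTO sales
REPLACE WHERE event_date = '2024-01-15'
SELECT *
FROM staging_sales
WHERE event_date = '2024-01-15';

-- Variant 2: dynamic overwrite config (the SET is rejected).
SET spark.sql.sources.partitionOverwriteMode = DYNAMIC;
INSERT OVERWRITE TABLE sales
SELECT *
FROM staging_sales
WHERE event_date = '2024-01-15';
```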
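The two-step fallback I'm describing would look something like this. It reaches the same end state, but each statement is only atomic on its own rather than the pair being a single atomic operation:

```sql
-- Step 1: clear the target partition.
DELETE FROM sales
WHERE event_date = '2024-01-15';

-- Step 2: insert the fresh data for that partition.
INSERT INTO sales
SELECT *
FROM staging_sales
WHERE event_date = '2024-01-15';
```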