Hello everyone. I have a question about the data lifecycle in ADLS. I know ADLS has its own lifecycle management rules, but they aren't working for my setup. I have two ADLS accounts: one for hot data and another for cool storage where information is archived. Both are connected to Databricks through an external catalog, and data lands in them as Parquet files plus the Delta transaction log (`_delta_log`).

The problem: when I try to copy the data with ADF and then empty my hot Delta Lakehouse, everything fails because ADF copies the files without understanding the Delta Log, so the copied table ends up inconsistent.

Is there a way to copy data from the hot ADLS account to the cool ADLS account so I can empty the hot account, while continuing to produce new data and archiving it in the cool account for later use when needed?
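For reference, one approach I've been reading about instead of a file-level ADF copy is Delta's `DEEP CLONE`, which copies both the data files and a consistent transaction log to the target location. A rough sketch of what I mean (the table name, storage account, and container are just placeholders for my setup):

```sql
-- Archive the hot table into the cool storage account as a standalone Delta table.
-- DEEP CLONE copies data files and creates a valid _delta_log at the target.
CREATE OR REPLACE TABLE delta.`abfss://archive@mycoolaccount.dfs.core.windows.net/archive/my_table`
DEEP CLONE my_catalog.my_schema.my_table;

-- After verifying the clone, the hot table could then be truncated/dropped
-- so new data keeps landing in the hot account.
```

Would something like this be the right way to do it, or is there a better pattern for moving Delta tables between storage tiers?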