Hi @faruko! Yes, why not, but only if the backup or export has a clear, consistent cutoff point and the continuous ingestion starts from that exact point, ideally based on an Oracle SCN rather than just whatever happened to be in the backup. I would not rely only on the ...
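To make that concrete, here is a minimal sketch of the handoff check. Assumptions: `choose_cdc_start_scn` is a hypothetical helper (not a real connector API), `snapshot_scn` is the SCN recorded when the backup/export was taken, and `connector_min_scn` is the oldest SCN still covered by the retained redo/archive logs.

```python
def choose_cdc_start_scn(snapshot_scn: int, connector_min_scn: int) -> int:
    """Start CDC at the snapshot SCN, but only if the retained logs still
    cover it; otherwise there is a gap and a fresh snapshot is needed."""
    if connector_min_scn > snapshot_scn:
        raise RuntimeError(
            f"Log retention starts at SCN {connector_min_scn}, after the "
            f"snapshot SCN {snapshot_scn}: CDC cannot resume without a gap."
        )
    return snapshot_scn
```

The point is that the cutoff is an explicit, recorded SCN, not an assumption about what the backup contained.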
Hello @ismaelhenzel! AFAIK, there is no documented native UC foreign catalog integration for a generic Iceberg REST catalog such as the BigLake REST Catalog today. Databricks does support Iceberg in UC, including UC managed and foreign Iceberg tables, but the docu...
Hi @faruko! My idea is to treat the initial load as a controlled batch backfill, then start the CDC pipeline afterwards from a clear cutoff point. You define a fixed cutoff timestamp or Oracle SCN for the initial snapshot and later load history in sma...
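A minimal sketch of the chunked backfill idea, assuming a date-partitioned source: split the historical range up to the fixed cutoff into small windows so each batch is restartable on its own. `backfill_windows` is an illustrative helper, not part of any product API.

```python
from datetime import date, timedelta

def backfill_windows(start: date, cutoff: date, days: int):
    """Split [start, cutoff) into fixed-size windows; each window can be
    loaded (and retried) as an independent batch before CDC starts."""
    windows, lo = [], start
    while lo < cutoff:
        hi = min(lo + timedelta(days=days), cutoff)
        windows.append((lo, hi))
        lo = hi
    return windows
```

Keeping the cutoff fixed means the last window ends exactly where CDC begins, so nothing is double-loaded or missed.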
Hello @shan-databricks! One additional point: I would also validate the expected load with the SQL Server DBA, because even if Lakeflow manages the parallelism internally, the source SQL Server still needs to handle those concurrent reads. For 100 tab...
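As a rough illustration of why the DBA conversation matters, here is a hedged sketch (not how Lakeflow schedules work internally) where a bounded worker pool stands in for capping how many tables are read from the source at once; the cap of 8 is an arbitrary example value you would agree with the DBA.

```python
from concurrent.futures import ThreadPoolExecutor

MAX_CONCURRENT_READS = 8  # illustrative cap agreed with the source DBA

def read_table(name: str) -> str:
    # placeholder for a real per-table extract against SQL Server
    return f"read {name}"

tables = [f"table_{i}" for i in range(100)]
# 100 tables, but never more than MAX_CONCURRENT_READS sessions at a time
with ThreadPoolExecutor(max_workers=MAX_CONCURRENT_READS) as pool:
    results = list(pool.map(read_table, tables))
```

The source sees at most the cap, not 100 simultaneous sessions, which is the number the DBA actually needs to sign off on.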
Hello @Darshan137! A few things I will add to @Lu_Wang_ENB_DBX's answer, based on a similar project I did. If ADF currently passes values such as environment, run date, catalog, schema, or business domain, define a clear parameter contract in Lakeflow Job...
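A minimal sketch of what such a parameter contract can look like: the job fails fast if ADF passes a malformed or incomplete set of values. In a real job these would arrive as job parameters; here a plain dict stands in, and the required keys and allowed environments are illustrative assumptions, not a fixed product contract.

```python
from datetime import date

REQUIRED = {"environment", "run_date", "catalog", "schema"}
ALLOWED_ENVS = {"dev", "test", "prod"}  # example values for illustration

def validate_params(params: dict) -> dict:
    """Validate the agreed contract before any pipeline work starts."""
    missing = REQUIRED - params.keys()
    if missing:
        raise ValueError(f"missing job parameters: {sorted(missing)}")
    if params["environment"] not in ALLOWED_ENVS:
        raise ValueError(f"unknown environment: {params['environment']}")
    date.fromisoformat(params["run_date"])  # fail fast on a bad date
    return params
```

Failing at the boundary like this keeps ADF-side mistakes from surfacing halfway through a run.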