04-25-2024 04:37 AM
Hi,
I have a DLT pipeline that applies changes from a source table (cdctest_cdc_enriched) to a target table (cdctest) using the following code:
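(The original code block did not survive in this post; below is a minimal sketch of what such an `apply_changes` call could look like. The key column `id`, the sequence column `sequence_num`, and the SQL Server CDC operation codes are assumptions based on the rest of the thread, not the actual pipeline.)

```python
import dlt
from pyspark.sql.functions import col, expr

# Target table that receives the applied changes.
dlt.create_streaming_table("cdctest")

# Apply CDC changes from the enriched feed into the target.
# Operation codes follow SQL Server CDC conventions:
# 1 = delete, 2 = insert, 3 = update (before image), 4 = update (after image).
dlt.apply_changes(
    target = "cdctest",
    source = "cdctest_cdc_enriched",
    keys = ["id"],                              # assumed primary key
    sequence_by = col("sequence_num"),          # assumed ordering column
    apply_as_deletes = expr("operation = 1"),
    except_column_list = ["operation", "sequence_num"],
    stored_as_scd_type = 1,
)
```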
Accepted Solutions
05-06-2024 05:46 AM
Hi Kaniz,
The DLT pipeline runs without errors, and all changes of type 1 and 2 (inserts and deletes) are applied correctly. But after checking the target table, the updates were apparently not reflected in the target. I have since created a workaround: I delete the rows in the source where operation = 3 (the update rows holding the values from before the update) and replace all instances of 4 with the string 'UPDATE' in the operation column (I had to change the data type of the operation column to a string and set the apply_as_deletes parameter to '1' instead of 1). This fixed it, and the pipeline now processes inserts, deletes, and all updates.
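(For reference, a rough sketch of how this workaround could be expressed as a preparation view in the pipeline; the key and sequence column names are assumed, as in the sketch above.)

```python
import dlt
from pyspark.sql.functions import col, expr, when

# Workaround: drop the before-image rows (operation = 3) and rewrite the
# after-image code 4 to the string 'UPDATE', casting the operation column
# to string along the way.
@dlt.view
def cdctest_cdc_prepared():
    return (
        dlt.read_stream("cdctest_cdc_enriched")
        .filter(col("operation") != 3)
        .withColumn(
            "operation",
            when(col("operation") == 4, "UPDATE")
            .otherwise(col("operation").cast("string")),
        )
    )

dlt.create_streaming_table("cdctest")

dlt.apply_changes(
    target = "cdctest",
    source = "cdctest_cdc_prepared",
    keys = ["id"],                                # assumed key
    sequence_by = col("sequence_num"),            # assumed ordering column
    apply_as_deletes = expr("operation = '1'"),   # string comparison now
)
```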
05-06-2024 06:45 AM
Hi @Retired_mod ,
Thanks for that, I would actually love some assistance. When I was at the Databricks Intelligence Day in early April, I asked the person giving the workshop about mirroring some tables from a SQL Server application database to the Delta Lake with Databricks. He told me that Databricks will release a feature offering exactly this functionality in Q2 (he thought it would be May), and he advised me to reach out to our account contact for more info. I tried reaching out to Claire Nicholl, who is supposed to be our responsible account executive (I was redirected to her by Anna Cumbelich), by emailing her. I did this on the 18th of April, but I am still waiting for a reply. Could you tell me anything about this new feature and/or redirect me to the right person?
Another question I have: would it be possible to get some actual support on Databricks? I found the page listing the support plans (https://www.databricks.com/support), but there is no info on the page about costs or how to subscribe to any of the plans.
05-06-2024 11:52 PM
Hi @Retired_mod ,
The mail asking for pricing info on the support plans has been sent.
With regard to the preview release of Delta Lake 3.0 Delta Universal Format (UniForm), I have read the release notes but fail to see how this helps in any way towards mirroring data from a SQL Server instance. Could you please explain?

