3 weeks ago
Hi,
I have a DLT pipeline that applies changes from a source table (cdctest_cdc_enriched) to a target table (cdctest) with the following code:
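The original snippet was not included in the post. As a rough sketch only, an apply_changes pipeline definition for this setup might look like the following. The table names come from the thread; the key column, the sequencing column, and the SCD type are placeholders I have assumed, and the code only runs inside a Databricks DLT pipeline, not standalone:

```python
# Hypothetical pipeline-configuration sketch -- runs only in the Databricks
# DLT runtime. Table names are from the thread; "customer_id" and
# "__$start_lsn" are assumed placeholders.
import dlt
from pyspark.sql.functions import col, expr

dlt.create_streaming_table("cdctest")

dlt.apply_changes(
    target="cdctest",
    source="cdctest_cdc_enriched",
    keys=["customer_id"],                    # assumed primary key
    sequence_by=col("__$start_lsn"),         # assumed CDC ordering column
    apply_as_deletes=expr("operation = 1"),  # SQL Server CDC: 1 = delete
    stored_as_scd_type=1,                    # overwrite updates in place
)
```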
2 weeks ago
Hi @Anske, It seems you’re encountering an issue with your Delta Live Tables (DLT) pipeline where updates from the source table are not being correctly applied to the target table.
Let’s troubleshoot this together!
Pipeline Update Process: when you trigger a pipeline update, DLT parses the pipeline, builds the dependency graph, and refreshes each table in order.
Update Types: the behaviour of a pipeline update depends on the update type, i.e. whether you run a regular (incremental) update or a full refresh.
Troubleshooting Steps:
1. Review the @dlt.table definition of your target table.
2. Check the stored_as_scd_type parameter in your apply_changes call (e.g., 1 for SCD Type 1, which updates rows in place, or 2 for SCD Type 2, which keeps history).
3. Validation: after the update completes, confirm that the changed rows actually appear in the target table.
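To make the SCD Type 1 semantics in the steps above concrete, here is a minimal pure-Python illustration (not the DLT implementation itself): each change either upserts or deletes a key, and only the latest state per key is kept. The operation codes follow the SQL Server CDC convention used later in this thread:

```python
def apply_changes_scd1(target, changes):
    """Illustrative SCD Type 1 apply-changes logic.

    target:  dict mapping key -> row
    changes: list of (key, operation, row), ordered by sequence.
    Operation codes follow SQL Server CDC: 1 = delete, 2 = insert,
    3 = update (before image), 4 = update (after image).
    """
    for key, op, row in changes:
        if op == 1:
            target.pop(key, None)   # delete the key if present
        elif op in (2, 4):
            target[key] = row       # insert or post-update image: upsert
        # op == 3 (pre-update image) carries no new state and is skipped
    return target

changes = [
    (10, 2, {"name": "alice"}),    # insert
    (10, 4, {"name": "alicia"}),   # update, after image
    (11, 2, {"name": "bob"}),      # insert
    (11, 1, None),                 # delete
]
apply_changes_scd1({}, changes)
# -> {10: {"name": "alicia"}}
```

If updates are silently dropped, comparing the target against the result of this kind of replay of the change feed is one way to pin down which operation codes are being mishandled.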
Please review these steps, and if the issue persists, provide additional details about your pipeline configuration and any error messages you encounter. We’ll continue troubleshooting from there! 😊
2 weeks ago
Hi Kaniz,
The DLT pipeline runs without errors, and all changes of type 1 and 2 (deletes and inserts) are applied correctly. However, after checking the target table, the updates were apparently not reflected in the target. I have since created a workaround: I delete the rows in the source where operation = 3 (update rows with the values before the update) and replace all instances of 4 in the operation column with the string 'UPDATE' (this required changing the data type of the operation column to a string and setting the apply_as_deletes parameter to '1' instead of 1). This fixed it, and the pipeline now processes inserts, deletes and all updates.
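The preprocessing described in this workaround can be sketched as follows. This is plain Python for illustration only (in the actual pipeline it would be a PySpark transformation), and the column names are taken from the thread:

```python
def preprocess_cdc(rows):
    """Sketch of the workaround: drop pre-update images (operation = 3),
    rewrite post-update images (operation = 4) as the string 'UPDATE',
    and cast the remaining codes to strings so the column has one type."""
    out = []
    for row in rows:
        if row["operation"] == 3:
            continue                       # drop "values before update" rows
        row = dict(row)                    # copy; leave the input untouched
        row["operation"] = "UPDATE" if row["operation"] == 4 else str(row["operation"])
        out.append(row)
    return out

rows = [
    {"id": 1, "operation": 2},  # insert
    {"id": 1, "operation": 3},  # update, before image -> dropped
    {"id": 1, "operation": 4},  # update, after image  -> 'UPDATE'
    {"id": 2, "operation": 1},  # delete               -> '1'
]
preprocess_cdc(rows)
# -> [{'id': 1, 'operation': '2'}, {'id': 1, 'operation': 'UPDATE'},
#     {'id': 2, 'operation': '1'}]
```

After this remapping, apply_as_deletes can match on the string '1', which is why the parameter had to change from 1 to '1'.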
2 weeks ago
Hi @Anske, That's great to hear that you were able to find a workaround for the issue you were facing with the DLT pipeline. Thank you for sharing the details of the workaround with us. It's always helpful to hear about the solutions that our users come up with. If you have any other questions or need further assistance, please feel free to reach out.
2 weeks ago
Hi @Kaniz,
Thanks for that. I actually would love some assistance. When I was at the Databricks Intelligence Day in early April, I asked the person giving the workshop about mirroring some tables from a SQL Server application database to the Delta Lake with Databricks. He told me that Databricks will release a feature offering exactly this functionality in Q2 (he thought it would be May), and he advised me to reach out to our account contact for more info. I have tried reaching out by email to Claire Nicholl, who is supposed to be our responsible account executive (I was redirected to her by Anna Cumbelich). I did this on the 18th of April, but I am still waiting for a reply. Could you tell me anything about this new feature and/or redirect me to the right person?
Another question I have: would it be possible to get some actual support on Databricks? I found the page listing the support plans (https://www.databricks.com/support), but there is no info on the page about costs or how to subscribe to any of the plans.
2 weeks ago
Hi @Anske,
2 weeks ago
Hi @Kaniz,
The mail requesting pricing info on the support plans has been sent.
With regard to the preview release of the Delta Lake 3.0 Delta Universal Format (UniForm): I have read the release notes but fail to see how this helps in any way towards mirroring data from a SQL Server instance. Could you please explain?