Hi @tarunnagpal !!
Adding to what @MariuszK said,
Using an LLM to accelerate the translation process is a great approach, but if the code is proprietary, it's best to use a model deployed in a private environment so the code is never sent to an external service.
Implementing a validation process is crucial to ensure that the translated tables in Databricks match the originals in Snowflake.
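To make the validation idea concrete, here is a minimal sketch in plain Python of one common technique: an order-insensitive table fingerprint (row count plus a combined row hash). The function names and toy rows are illustrative only; a real pipeline would compute the same aggregates through the Snowflake and Spark connectors rather than in local lists.

```python
import hashlib

def table_fingerprint(rows):
    """Order-insensitive fingerprint: hash each row, combine digests by
    addition mod 2**256 (addition, not XOR, so duplicate rows don't cancel)."""
    acc = 0
    for row in rows:
        digest = hashlib.sha256(repr(tuple(row)).encode()).digest()
        acc = (acc + int.from_bytes(digest, "big")) % (1 << 256)
    return (len(rows), acc)

def tables_match(source_rows, target_rows):
    """True when both snapshots have the same row count and fingerprint."""
    return table_fingerprint(source_rows) == table_fingerprint(target_rows)

# Toy snapshots standing in for Snowflake / Databricks extracts.
snowflake_rows = [(1, "alice"), (2, "bob")]
databricks_rows = [(2, "bob"), (1, "alice")]  # same data, different order
print(tables_match(snowflake_rows, databricks_rows))  # True
```

Because the combine step is commutative, row order (which often differs after a migration) doesn't affect the result, while any changed, missing, or extra row does.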
Since Databricks doesn't support stored procedures, you can replace them with Notebooks orchestrated by workflows or Delta Live Tables (DLTs). While DLTs may require more translation effort upfront, they can offer long-term benefits.
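For the notebook-plus-workflows route, the job definition can stay very small. Below is a sketch of a Jobs API 2.1-style spec (as a Python dict) that replaces a scheduled stored procedure with a scheduled notebook run; the job name, task key, notebook path, and cron schedule are all placeholders.

```python
# Sketch of a Databricks Jobs API 2.1-style job spec that replaces a
# scheduled stored procedure with a nightly notebook run.
# All names and paths below are hypothetical placeholders.
job_spec = {
    "name": "nightly_load",
    "tasks": [
        {
            "task_key": "run_translated_proc",
            "notebook_task": {
                # Notebook holding the logic translated from the procedure
                "notebook_path": "/Repos/migration/nightly_load",
            },
        }
    ],
    "schedule": {
        "quartz_cron_expression": "0 0 2 * * ?",  # daily at 02:00
        "timezone_id": "UTC",
    },
}

print(job_spec["tasks"][0]["task_key"])  # run_translated_proc
```

Each translated procedure becomes one task, so dependencies between procedures map naturally onto task dependencies within a single job.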
At SunnyData, we've developed solutions to streamline steps 1 and 2:
- We've been customizing an LLM to improve translation accuracy for these types of migrations. While we haven't deployed it yet, we're actively exploring its capabilities.
- We've built a solution that gathers statistical insights on tables and performs large-scale comparisons to validate their equivalence, even with massive datasets. It also highlights any discrepancies.
Would love to hear more about your specific migration challenges! Feel free to DM me for any follow-up questions!
Best,
Eliana Oviedo
BD & Partnerships | Strategist
P: (598) 95-974-524
E: eliana.oviedo@sunnydata.ai