Hi @Sreejuv
You're encountering a very common challenge. Converting Oracle and Synapse stored procedures to Databricks SQL is notoriously difficult, and many organizations struggle with it.
Common Issues with Automated Conversion
Why procedures often fail to convert:
- Syntax differences (Oracle PL/SQL vs T-SQL vs Databricks SQL)
- Procedural logic doesn't translate well to Databricks' SQL dialect
- Built-in functions have different names/behaviors
- Control flow structures (loops, cursors, exception handling)
- Data types and precision differences
Successful Conversion Strategies
1. Manual Conversion Approach
- Most successful migrations rely on manual conversion; treat any automated output as a first draft.
2. Hybrid Approach - Python/Scala + SQL
- Keep set-based logic in SQL and move procedural logic into Python/Scala functions (see the sketch after this list).
3. Delta Live Tables for ETL Procedures
- Re-express ETL procedures as declarative DLT pipelines instead of imperative stored procedures.
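As a minimal sketch of the hybrid approach (item 2), assuming a hypothetical Oracle procedure that loads daily sales: the table and column names (sales_raw, sales_clean, amount) are made up for illustration, but the pattern of mixing spark.sql with DataFrame calls is the general idea.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

def load_daily_sales(run_date: str) -> None:
    """Hypothetical replacement for an Oracle ETL procedure:
    set-based steps stay in SQL, procedural steps move to Python."""
    # Set-based logic stays as SQL (note the function mappings)
    staged = spark.sql(f"""
        SELECT customer_id,
               coalesce(amount, 0)  AS amount,   -- NVL -> coalesce
               current_timestamp()  AS load_ts   -- SYSDATE -> current_timestamp
        FROM sales_raw
        WHERE sale_date = '{run_date}'
    """)

    # Procedural filtering/branching moves to the DataFrame API
    cleaned = staged.filter(F.col("amount") > 0)

    # The procedure's INSERT becomes a table write
    cleaned.write.mode("append").saveAsTable("sales_clean")
```

The design point is that each imperative step of the procedure becomes either a SQL statement or a DataFrame transformation, so nothing depends on session state the way PL/SQL variables and cursors do.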
Successful Migration Patterns
1. Categorize Your Procedures
- ETL procedures → Convert to Delta Live Tables or Databricks workflows
- Business logic procedures → Convert to Python/Scala functions
- Data validation procedures → Convert to data quality checks (see the sketch after this list)
- Reporting procedures → Convert to parameterized queries or dashboards
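For the data validation category, here is a minimal sketch of a procedure-to-check conversion; the table/column arguments and the function name check_no_null_keys are hypothetical.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

def check_no_null_keys(table: str, key_col: str) -> None:
    """Hypothetical data-quality check replacing a validation
    procedure: raise instead of RAISE_APPLICATION_ERROR."""
    null_count = spark.table(table).filter(F.col(key_col).isNull()).count()
    if null_count > 0:
        raise ValueError(f"{table}.{key_col} contains {null_count} NULL values")
```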
2. Common Conversion Mappings
- Oracle/Synapse → Databricks:
-- Cursors → DataFrame operations (see the sketch after this list)
-- SYSDATE/GETDATE() → current_timestamp()
-- NVL/ISNULL → coalesce()
-- ROWNUM → row_number()
-- Exception handling → try/except in Python
-- Loops → DataFrame transformations or Python loops
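To make the cursor and ROWNUM mappings concrete, here is a hedged sketch assuming a hypothetical orders table: a cursor loop that picked the latest order per customer collapses into a single window expression.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.getOrCreate()

# What the cursor produced row by row, a window function produces
# in one set-based transformation (ROWNUM-style numbering via row_number()).
w = Window.partitionBy("customer_id").orderBy(F.col("order_ts").desc())

latest_orders = (
    spark.table("orders")                      # hypothetical source table
         .withColumn("rn", F.row_number().over(w))
         .filter(F.col("rn") == 1)             # keep first row per customer
         .drop("rn")
)
```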
3. Tools That Actually Work
- Manual conversion (most reliable)
- AWS Schema Conversion Tool (limited success)
- Custom Python scripts for pattern matching
- Databricks Assistant for individual query conversion
Recommended Approach
- Start with simple procedures to understand patterns
- Create a conversion template for common patterns
- Use Databricks notebooks for complex logic
- Implement as workflows rather than stored procedures
- Test thoroughly with representative data
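Tying this together, a hedged sketch of the workflow-instead-of-procedure recommendation, reusing the hypothetical functions from the sketches above: the procedure's BEGIN/EXCEPTION/END shell becomes a plain Python try/except that a Databricks workflow can schedule.

```python
def run_pipeline(run_date: str) -> None:
    """Hypothetical workflow entry point replacing a top-level procedure."""
    try:
        load_daily_sales(run_date)                         # converted ETL step
        check_no_null_keys("sales_clean", "customer_id")   # converted validation
    except Exception as exc:
        # Oracle's EXCEPTION block becomes ordinary Python error handling;
        # re-raise so the workflow run is marked as failed.
        print(f"Pipeline failed for {run_date}: {exc}")
        raise
```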
LR