Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Lakebridge code conversion from Oracle to Databricks SQL

rahult1407
New Contributor

Hi Community,

I’m working on migrating several Oracle views to SparkSQL using the Databricks Labs Lakebridge (Lakehouse Bridge) tool, and I’m facing issues while converting the code for Oracle views and materialized views.

Problems I’m encountering:

The converted SQL sometimes contains syntax or functions not supported in SparkSQL.

Any guidance or resources would be appreciated!

Thanks,
Rahul

 

3 REPLIES

Louis_Frolio
Databricks Employee

Greetings @rahult1407, I did some digging and would like to offer some helpful hints to guide you through your conversion.

Databricks Lakebridge (formerly Lakehouse Bridge) is a powerful migration tool, but Oracle-to-SparkSQL conversion, especially for views and materialized views, presents several challenges that require both automated assistance and manual intervention.

Common Conversion Challenges

Automated conversion tools like Lakebridge typically achieve around 75% conversion rates, with the remaining 25% requiring manual rework. Oracle-to-Databricks SQL procedure and view conversion is notoriously difficult due to fundamental differences in SQL dialects, built-in functions, data types, and architectural approaches.

Key Function and Syntax Differences

Here are critical Oracle to SparkSQL mappings you'll need to address:

- NVL/ISNULL → coalesce()
- SYSDATE → current_timestamp()
- ROWNUM → row_number() window function (or LIMIT for simple row caps)
- INSTR (4-parameter version) → substring_index() or a combination of locate() calls
- TO_DATE → to_timestamp() or unix_timestamp() with Spark-style format strings (e.g., 'yyyy-MM-dd' instead of Oracle's 'YYYY-MM-DD')
- Oracle NUMBER → DECIMAL (maximum precision 38), or DOUBLE when precision exceeds 38 digits, trading exactness for range
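To make the mappings concrete, here is a minimal before/after sketch. The orders table and its columns are hypothetical; the rewrite pattern is what matters:

```sql
-- Oracle original (shown as comments):
--   CREATE VIEW recent_orders_v AS
--   SELECT order_id,
--          NVL(discount, 0) AS discount,
--          TO_DATE(order_dt, 'YYYY-MM-DD') AS order_date,
--          SYSDATE AS loaded_at
--   FROM orders
--   WHERE ROWNUM <= 100;

-- SparkSQL / Databricks SQL rewrite:
CREATE OR REPLACE VIEW recent_orders_v AS
SELECT order_id,
       coalesce(discount, 0)           AS discount,
       to_date(order_dt, 'yyyy-MM-dd') AS order_date,  -- Spark datetime patterns are case-sensitive
       current_timestamp()             AS loaded_at
FROM (
  SELECT *,
         row_number() OVER (ORDER BY order_dt DESC) AS rn  -- ROWNUM has no direct equivalent
  FROM orders
) t
WHERE rn <= 100;
```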

Handling Views and Materialized Views

For regular views, the conversion process involves translating the view definition SQL from Oracle syntax to SparkSQL syntax, then recreating them in Databricks. For materialized views, Databricks supports them natively through Lakeflow pipelines that automatically manage refresh operations.
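As a rough sketch of what the materialized view side can look like after migration (table and column names are hypothetical; a refresh SCHEDULE can also be declared at creation time):

```sql
-- Databricks SQL materialized view; refresh is handled by a managed
-- Lakeflow pipeline rather than Oracle-style refresh jobs.
CREATE MATERIALIZED VIEW sales_summary_mv AS
SELECT region,
       date_trunc('MONTH', order_date) AS order_month,
       sum(amount)                     AS total_amount
FROM orders
GROUP BY region, date_trunc('MONTH', order_date);

-- On-demand refresh:
REFRESH MATERIALIZED VIEW sales_summary_mv;
```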

Recommended Approach

Rather than relying solely on automated conversion, adopt this hybrid strategy:

1. Categorize your views - ETL-focused views should convert to Delta Live Tables, while reporting views can become parameterized queries or dashboards
2. Manual conversion for complex logic - Most successful migrations involve significant manual work, especially for procedural logic
3. Test with representative data - Validate both syntax conversion and semantic equivalence (see the cross-check sketch after this list)
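For step 3, one lightweight pattern is a row-count and checksum cross-check. The sketch below assumes the Oracle source is reachable through Lakehouse Federation as oracle_cat.app_schema and the converted view lives in main.app_schema; all of these names are hypothetical:

```sql
-- Compare row counts and a simple hash-based checksum between the Oracle
-- source (via Lakehouse Federation) and the converted Databricks view.
SELECT 'oracle' AS side,
       count(*) AS row_cnt,
       sum(hash(order_id, amount)) AS checksum
FROM oracle_cat.app_schema.recent_orders_v
UNION ALL
SELECT 'databricks' AS side,
       count(*) AS row_cnt,
       sum(hash(order_id, amount)) AS checksum
FROM main.app_schema.recent_orders_v;
```

If the two rows disagree, drill down with per-key or per-partition comparisons to isolate where the converted logic diverges.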

At the end of the day, these migrations are involved and nuanced. Lakebridge is a great tool that will get you through roughly 75% of what needs to be converted; the rest will involve manual intervention to address the edge cases.

Hope this helps, Louis.


Hubert-Dudek
Esteemed Contributor III

Not a very good answer, but I was facing the same issues with the tool. I need to give it a second chance soon.

Raman_Unifeye
Contributor III

@Louis_Frolio - will there be similar guidelines for other 'source' dialects too, such as T-SQL and Teradata? Is there any comprehensive documentation on Lakebridge covering each source?


RG #Driving Business Outcomes with Data Intelligence
