Informatica ETLs

DataYoga
New Contributor

I'm delving into the challenges of ETL transformations, particularly moving from traditional platforms like Informatica to Databricks. Given the complexity of legacy ETLs, I'm curious about the approaches others have taken to integrate these with Databricks' modern data analytics capabilities.

Have you faced hurdles in this transformation process? What strategies or tools have you found effective in converting Informatica workflows for use in Databricks? Any shared experiences or insights would be incredibly valuable as we navigate this transition.

Looking forward to your thoughts and advice.

Best,

John Reuben

DataYoga.io

4 REPLIES

owene
New Contributor II

@Retired_mod Do you know if Informatica Cloud Modernization can convert mappings into Delta Live Tables? Do we have to use Informatica Cloud for this, or can we use it for a one-time migration and then maintain the artifacts in Databricks?

Alternatively, we are looking at using dbt with Databricks SQL. Is that supported as well?

You may explore the tools and services from Travinto Technologies; they have very good tools. We explored their tool for converting our code from Informatica, DataStage, and Ab Initio to Databricks/PySpark, and we also used it for SQL queries, stored procedures, triggers, cursors, etc.

Kalyan4
New Contributor II

Yes, this is a common challenge. Most teams avoid 1:1 conversion and instead rationalize ETLs first, then rebuild using Spark-native patterns (joins, lookups, SCDs) with strong validation. We've seen good results combining this approach with Kanerika's FLIP platform, which helps automate large parts of Informatica-to-Databricks migration while preserving business logic and reducing manual effort. Curious whether you're dealing more with PowerCenter or BDM complexity.
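
To make the "Spark-native patterns" point concrete, here is a minimal sketch of rebuilding an Informatica SCD Type 2 mapping as a Delta Lake MERGE in PySpark. The table and column names (dim_customer, staging_customers, customer_id, name) are hypothetical placeholders, not anything from this thread, and a real dimension would carry more attributes and validation:

```python
# Minimal SCD Type 2 sketch using Delta Lake MERGE (hypothetical schema:
# customer_id, name, is_current, valid_from, valid_to).
from delta.tables import DeltaTable
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

updates = spark.table("staging_customers")       # incoming batch (hypothetical source)
dim = DeltaTable.forName(spark, "dim_customer")  # target dimension table

# Phase 1: close out current rows whose tracked attributes changed.
(dim.alias("t")
    .merge(updates.alias("s"), "t.customer_id = s.customer_id AND t.is_current = true")
    .whenMatchedUpdate(
        condition="t.name <> s.name",
        set={"is_current": "false", "valid_to": "current_timestamp()"})
    .execute())

# Phase 2: append new versions for changed keys (just closed above)
# and brand-new keys; unchanged keys still have a current row and are
# excluded by the anti-join. Assumes the staging table carries exactly
# the dimension's business columns.
current = spark.table("dim_customer").where("is_current = true").select("customer_id")
new_rows = (updates.join(current, "customer_id", "left_anti")
            .withColumn("is_current", F.lit(True))
            .withColumn("valid_from", F.current_timestamp())
            .withColumn("valid_to", F.lit(None).cast("timestamp")))
new_rows.write.format("delta").mode("append").saveAsTable("dim_customer")
```

The two-phase shape (close out changed current rows, then append new versions) is one Spark-native replacement for what would typically be a lookup-plus-update-strategy chain in a PowerCenter mapping.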

zalmane
Visitor

We ended up using the tool from datayoga.io, which converts these in a multi-stage approach: it first converts the mapping to an intermediate representation; from there, the pipeline gets optimized (many of the Informatica actions can be optimized out or compacted); and finally it renders SQL queries in the Spark SQL dialect. It has been working very well for us. Happy to provide more info. This is for Informatica PowerCenter mappings.
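
For readers curious what such a multi-stage conversion can look like, below is a toy, heavily simplified sketch; the actual datayoga.io tool's intermediate representation is not documented here, so every name and structure in it is illustrative only. It walks the three stages the post describes: parse to an intermediate representation, optimize/compact it, then render Spark SQL.

```python
# Toy illustration of a multi-stage converter: IR -> optimize -> Spark SQL.
from dataclasses import dataclass

@dataclass
class Step:           # one node of the toy intermediate representation
    op: str           # "source", "expression", "filter", or "target"
    arg: str = ""

# An Informatica-style mapping parsed into a linear IR (hypothetical).
mapping = [
    Step("source", "orders"),
    Step("expression", "amount * 1.1 AS amount_adj"),
    Step("filter", "status = 'OPEN'"),
    Step("filter", "status = 'OPEN'"),  # redundant step a parser might emit
    Step("target", "orders_adj"),
]

def optimize(steps):
    """Compact the IR: drop consecutive duplicate steps (a stand-in for
    the richer optimizations the post mentions)."""
    out = []
    for s in steps:
        if not out or (s.op, s.arg) != (out[-1].op, out[-1].arg):
            out.append(s)
    return out

def render_spark_sql(steps):
    """Render the optimized IR as a single Spark SQL statement."""
    source = next(s.arg for s in steps if s.op == "source")
    target = next(s.arg for s in steps if s.op == "target")
    exprs = [s.arg for s in steps if s.op == "expression"] or ["*"]
    filters = [s.arg for s in steps if s.op == "filter"]
    where = f" WHERE {' AND '.join(filters)}" if filters else ""
    return (f"CREATE OR REPLACE TABLE {target} AS "
            f"SELECT {', '.join(exprs)} FROM {source}{where}")

print(render_spark_sql(optimize(mapping)))
# CREATE OR REPLACE TABLE orders_adj AS SELECT amount * 1.1 AS amount_adj
# FROM orders WHERE status = 'OPEN'
```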