Lakebridge: options for rehosting ETL into AWS Databricks?
==========================================
Hi Community experts,
Thanks for the replies to my earlier threads.
We reviewed the Lakebridge thread opened here. The claimed functionality is that it can convert on-prem ETL (Informatica) into Databricks notebooks and run the ETL within the cloud Databricks framework.
How does this work?
For example, our on-prem Informatica deployment includes:
bash scripts (driving scripts)
Mappings
Sessions
Workflows
Scheduled jobs
How will the above INFA artifacts land/sit in the Databricks framework in the cloud?
INFA supports connectivity/configurations for heterogeneous legacy data sources (many DBs: IMS, VSAM, DB2, Unisys DB, etc.).
Currently, our understanding is that we need a mechanism to land the data into S3, from which Databricks can consume and load it.
How will the Lakebridge-converted INFA ETL bring data from these legacy data sources into S3 for Databricks consumption?
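To make the question concrete, here is a minimal sketch of the land-then-consume pattern we have today. It is an assumption/illustration only, not Lakebridge output: a local temp directory stands in for an S3 landing prefix, the sample rows stand in for an extract from a legacy source, and all names (`land_to_stage`, `consume_from_stage`, the bucket path in the comment) are hypothetical.

```python
import csv
import pathlib
import tempfile

# Stand-in for the on-prem extract step: rows pulled from a legacy source
# (hypothetical data; in practice this would come via DB2/VSAM connectors).
rows = [
    {"cust_id": "1001", "region": "EMEA"},
    {"cust_id": "1002", "region": "APAC"},
]

def land_to_stage(records, stage_dir, name):
    """Write records as CSV into a staging area (a local directory here,
    standing in for an S3 prefix such as s3://<bucket>/landing/)."""
    path = pathlib.Path(stage_dir) / f"{name}.csv"
    with path.open("w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(records[0].keys()))
        writer.writeheader()
        writer.writerows(records)
    return path

def consume_from_stage(path):
    """Read the staged file back, the way a Databricks job would read the
    landed S3 objects (e.g. via spark.read or Auto Loader)."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

with tempfile.TemporaryDirectory() as d:
    staged = land_to_stage(rows, d, "customers")
    loaded = consume_from_stage(staged)
    print(loaded == rows)  # → True
```

Our question is essentially which part of this pipeline the Lakebridge-converted artifacts would replace: the extract/landing step, only the consumption step, or both.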
Thanks for your guidance.