08-11-2024 08:20 PM
I am currently managing nearly 300 tables from a production database and considering moving the entire ETL process away from Azure Data Factory to Databricks.
This process, which involves extraction, transformation, testing, and loading, is executed daily.
Given this context, I am unsure whether it's more efficient to:
- create one notebook/script per table (roughly 300 separate files), or
- consolidate everything into a single large script.

I am also wondering whether to develop the pipeline in Databricks notebooks or as .py scripts.
Thank you for your insights!
08-11-2024 09:38 PM
Hi,
Instead of 300 individual files or one massive script, try grouping similar tables together. For example, you could have 10 scripts, each handling 30 tables. This way you get the best of both approaches: debugging stays easy, and you don't have too many files to manage.
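To make that concrete, here is a minimal sketch of what one group script could look like. The table names, JDBC connection details, and the transform_table function are placeholders for your own setup, not anything specific to your environment:

```python
# group_sales.py - one script per logical group of ~30 tables (names below are placeholders)
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # provided automatically inside Databricks notebooks

TABLES = ["sales_orders", "sales_order_items", "sales_returns"]  # up to ~30 tables per group

JDBC_URL = "jdbc:sqlserver://<host>:1433;databaseName=<db>"      # hypothetical source database
JDBC_PROPS = {
    "user": "<user>",
    "password": "<password>",
    "driver": "com.microsoft.sqlserver.jdbc.SQLServerDriver",
}

def transform_table(df, table_name):
    # Placeholder: per-table cleaning / conforming logic goes here
    return df

for table in TABLES:
    # Extract from the production database
    source_df = spark.read.jdbc(url=JDBC_URL, table=table, properties=JDBC_PROPS)

    # Transform
    clean_df = transform_table(source_df, table)

    # Load as a Delta table (assumes a target schema, e.g. "bronze", already exists)
    clean_df.write.format("delta").mode("overwrite").saveAsTable(f"bronze.{table}")
```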
Start with notebooks, and once everything's running smoothly, consider converting them into .py scripts.
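One way to make that conversion painless is to keep the per-table logic in plain functions from the start, so the same code can be imported by a notebook while you develop and by a scheduled .py task later. The module and function names below are just an illustration:

```python
# etl_lib.py - shared helpers usable from notebooks and from .py job scripts
from pyspark.sql import DataFrame, SparkSession

def get_spark() -> SparkSession:
    # Notebooks already have `spark`; a standalone .py job creates or reuses a session here
    return SparkSession.builder.getOrCreate()

def extract(spark: SparkSession, url: str, table: str, props: dict) -> DataFrame:
    return spark.read.jdbc(url=url, table=table, properties=props)

def load_to_delta(df: DataFrame, target_table: str) -> None:
    # One place to change write behaviour (mode, partitioning, schema evolution, ...)
    df.write.format("delta").mode("overwrite").saveAsTable(target_table)
```

While developing you can import these helpers from a notebook (e.g. via workspace files or a Repo), and once things are stable the same functions can be called from a thin .py entry point scheduled as a Databricks job.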
One more tip - look into using Delta Lake in Databricks. It makes managing your data easier and more reliable (ACID transactions, schema enforcement, time travel, and MERGE for incremental loads).
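For the daily loads specifically, Delta's MERGE lets you upsert only the rows that changed instead of overwriting whole tables. A rough sketch, assuming the target Delta table already exists and `id` is the business key (both the table names and the key are placeholders):

```python
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

daily_extract = spark.table("staging.sales_orders_daily")  # hypothetical table holding today's extract
target = DeltaTable.forName(spark, "bronze.sales_orders")  # hypothetical existing Delta table

(target.alias("t")
    .merge(daily_extract.alias("s"), "t.id = s.id")
    .whenMatchedUpdateAll()      # update rows that changed
    .whenNotMatchedInsertAll()   # insert new rows
    .execute())
```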
Give it a try.
08-13-2024 04:26 PM
Thank you Brahmareddy!
Not too sure why I never thought of that 🙄!
08-13-2024 05:15 PM
You are welcome, Joeyong! Good day.