08-11-2024 08:20 PM
I am currently managing nearly 300 tables from a production database and considering moving the entire ETL process from Azure Data Factory to Databricks.
This process, which involves extraction, transformation, testing, and loading, is executed daily.
Given this context, I am unsure whether it is more efficient to keep one notebook or script per table (roughly 300 files) or to consolidate everything into a single large script, and whether to build the process in Databricks notebooks or in .py files.
Thank you for your insights!
08-11-2024 09:38 PM
Hi,
Instead of 300 individual files or one massive script, try grouping similar tables together. For example, you could have 10 scripts, each handling 30 tables. This way you get the best of both approaches: easy debugging without having too many files to manage.
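Here is a rough sketch of what one of those grouped scripts could look like, assuming a JDBC source database. The group layout, table names, connection string, and target schema below are only placeholders for illustration:

```python
# Minimal sketch of one "grouped" script, assuming a JDBC source database.
# Table names, connection details, and the bronze schema are illustrative placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # already available as `spark` in a Databricks notebook

# One group of related tables handled by this script; other groups live in their own scripts/notebooks.
SALES_GROUP = ["orders", "order_items", "customers"]  # hypothetical table names

JDBC_URL = "jdbc:sqlserver://<host>:1433;databaseName=<db>"  # placeholder connection string

def load_table(table_name: str):
    """Extract one table from the source database over JDBC."""
    return (
        spark.read.format("jdbc")
        .option("url", JDBC_URL)
        .option("dbtable", table_name)
        .option("user", "<user>")          # in practice, pull credentials from a secret scope
        .option("password", "<password>")
        .load()
    )

for table in SALES_GROUP:
    df = load_table(table)
    # Apply the group's transformations here, then land the result as a Delta table.
    (
        df.write.format("delta")
        .mode("overwrite")
        .saveAsTable(f"bronze.{table}")  # hypothetical target schema
    )
```

Each group script stays small enough to debug on its own, and the daily job just runs the 10 scripts in sequence or in parallel.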
Start with notebooks, and once everything's running smoothly, consider converting them into .py scripts.
One more tip: look into using Delta Lake in Databricks. It gives you ACID transactions and schema enforcement, which makes daily loads easier to manage and more reliable.
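If your daily loads are incremental rather than full refreshes, a Delta MERGE keeps the target tables consistent without rewriting everything. This is only a sketch and assumes each table has a key column such as `id`; adjust the join condition per table:

```python
# Hedged sketch of an incremental upsert into a Delta table using MERGE.
# Assumes the target Delta table already exists and has a key column (here `id`).
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

def upsert_to_delta(updates_df, target_table: str, key_col: str = "id"):
    """Merge the day's extract into the existing Delta table instead of overwriting it."""
    target = DeltaTable.forName(spark, target_table)
    (
        target.alias("t")
        .merge(updates_df.alias("s"), f"t.{key_col} = s.{key_col}")
        .whenMatchedUpdateAll()
        .whenNotMatchedInsertAll()
        .execute()
    )

# Example usage with a hypothetical daily extract:
# upsert_to_delta(load_table("orders"), "bronze.orders", key_col="order_id")
```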
Give it a try.
08-13-2024 04:26 PM
Thank you, Brahmareddy!
Not too sure why I never thought of that!
08-13-2024 05:15 PM
You are welcome, Joeyong! Good day.