Dear Databricks people,
We are currently measuring DLT performance and cost on a medallion architecture with 150 to 300 tables, and we plan to add even more tables.
I've been running automated incremental streaming DLT pipeline updates every 3 hours through the night. I was pleasantly surprised this morning to see that the "setting up tables" (aka SETTING_UP_TABLES) stage for the 300-table case dropped from 27 minutes to 14 minutes when the preview runtime upgraded from `dlt:15.4.4-delta-pipelines-dlt-release-2024.42-rc0-commit-10aaba0-image-c48da6f` to `dlt:15.4.4-delta-pipelines-dlt-release-2024.44-rc1-commit-1a62345-image-ba3b9ec`.
Thank you for this already substantial speed-up, and please keep improving it if at all possible!
I also hope you have solid performance tests in your regression suite, so this stage doesn't slow down again in future releases.
Relevant previous thread on this forum: https://community.databricks.com/t5/data-engineering/delta-live-tables-too-much-time-to-do-the-quot-...