Hey @juanjomendez96 ,
You’ve explained this really well, and yes, what you’re experiencing is currently one of the main limitations of Databricks Apps.
Right now, these apps run on fixed compute that Databricks manages, which means we, as users, don't get the option to pick or scale the instance type.
The idea behind this is to keep apps lightweight and quick to deploy, but it also means they can struggle when you're working with large datasets or heavy computations.
In your cloud setup, you could easily switch to a bigger instance and get more power, but with Databricks Apps, that control isn’t available yet.
What You Can Try
--> Move heavy logic outside the app. You can run big joins or transformations in a separate job or SQL warehouse and let the app read from the preprocessed Delta table instead (there's a rough job sketch after this list).
--> Use SQL warehouses for dashboards. If your app mostly displays query results, you can connect it to a SQL warehouse and scale that up when needed (see the app-side sketch below).
--> Optimize your data. Cache or pre-aggregate data so the app only loads what’s needed.
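
To make the first point concrete, here's a rough sketch of what the offloaded job could look like. This is just an illustration, not an official pattern: the table names (main.sales.transactions, main.sales.daily_summary) and columns are made up for the example, and `spark` is the session Databricks provides automatically in notebooks and jobs.

```python
# Sketch of a scheduled Databricks job that does the heavy lifting
# outside the app. Table/column names below are placeholders.
from pyspark.sql import functions as F

# `spark` is provided by the Databricks runtime in notebooks and jobs.
raw = spark.read.table("main.sales.transactions")

# Pre-aggregate so the app only ever reads a small summary table.
daily = (
    raw.groupBy("region", F.to_date("event_ts").alias("event_date"))
       .agg(
           F.sum("amount").alias("total_amount"),
           F.countDistinct("customer_id").alias("unique_customers"),
       )
)

# Overwrite the summary Delta table on each run; the app reads this,
# never the raw data.
daily.write.format("delta").mode("overwrite").saveAsTable("main.sales.daily_summary")
```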
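And on the app side, a minimal sketch of reading that summary table through a SQL warehouse with the databricks-sql-connector package. The environment variable names here are placeholders; plug in whatever credentials and warehouse HTTP path your app actually has available.

```python
# Sketch of the app-side read via a SQL warehouse, assuming the
# databricks-sql-connector package. Env var names are placeholders.
import os
from databricks import sql

with sql.connect(
    server_hostname=os.environ["DATABRICKS_SERVER_HOSTNAME"],  # workspace hostname
    http_path=os.environ["DATABRICKS_HTTP_PATH"],              # SQL warehouse HTTP path
    access_token=os.environ["DATABRICKS_TOKEN"],
) as conn:
    with conn.cursor() as cursor:
        # Query only the small, pre-aggregated table, which is cheap for the app.
        cursor.execute(
            "SELECT region, event_date, total_amount "
            "FROM main.sales.daily_summary "
            "ORDER BY event_date DESC LIMIT 100"
        )
        rows = cursor.fetchall()

for row in rows:
    print(row.region, row.event_date, row.total_amount)
```

Since the app only ever touches the small summary table, the warehouse can stay small too, and you scale the job's cluster instead as the raw data grows.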
Databricks Apps are great for saving DevOps time and building internal dashboards quickly, but at the moment, they’re better suited for light to medium workloads.
For larger workloads, combining them with a SQL warehouse or an external job is usually the best way to go.
I’d also suggest sending this feedback through your Databricks account team or the feedback portal. Several users have already requested configurable compute and higher secret limits, so more voices definitely help.
Hope that helps clarify things a bit...
harisankar