I have a process that does several database updates in quick succession. It is slower than I believe it should be, so I am reworking the process and trying to speed it up.
Tonight it is consistently slow.
The debugger is warning me that things could be sped up with more capacity. But even after giving it a boost of 2 units, performance is abysmal.
Simply eyeballing the process, it looks to me like it is taking six to eight seconds longer than it should, and that is what Bubble is reporting in its debugger warnings. This is after boosting, and it’s consistent with the slow responses I saw without the boost.
Here are the capacity charts for the last six hours, the timeframe in which I’ve done most of my testing today:
I’ve been told that “Make changes to” is much faster than “Create a new”, so I set up the database and workflows to prime the database by creating a new, empty (“model”) thing in an API workflow. Then, rather than creating a thing at the moment the user needs it, the process grabs a model thing and changes it to reflect what the user thinks of as a new thing. The workflow then triggers the API workflow to create another model thing as a replacement. (In reality, I keep three model things available per user, so even with sluggish performance there should always be at least one available.)
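Bubble workflows are configured visually rather than in code, but the pre-creation scheme above is essentially a generic object pool. Here is a minimal Python sketch of that idea, where `ThingPool` and `create_blank` are illustrative names I made up, not anything from Bubble itself:

```python
from collections import deque

POOL_SIZE = 3  # the post keeps three pre-created "model" things per user


class ThingPool:
    """Pre-creates blank records so a user action only has to modify one,
    mirroring the "grab a model thing and make changes to it" workflow."""

    def __init__(self, create_blank):
        # create_blank stands in for the API workflow that makes a model thing
        self.create_blank = create_blank
        self.pool = deque(create_blank() for _ in range(POOL_SIZE))

    def claim(self, **user_data):
        # "Make changes to": take a pre-made blank thing and update it in place
        thing = self.pool.popleft()
        thing.update(user_data)
        # Trigger creation of a replacement blank so the pool stays topped up
        self.pool.append(self.create_blank())
        return thing


pool = ThingPool(lambda: {"status": "blank"})
record = pool.claim(status="active", owner="alice")
print(record["owner"], len(pool.pool))  # the pool is refilled back to 3
```

The point of the pattern is that the slow step (creation) is moved off the user's critical path; the user-facing step is only ever an update.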
What am I not understanding about performance and how to speed up database updates?