You may want to look into the Data API if you simply need to load the data – it’s much faster than loading each row one at a time (probably 100x faster).
We use the Data API to load 500 or so records in under a second. Seems like you could, essentially, paginate the 1,000,000 rows into groups of, say, 500 and pass that in using the Data API. Note, the Data API loads all of the data at once, instead of loading each item one at a time (i.e., running on a list of things). So, it’s much better at dealing with large sets of data.
I do believe it times out after a couple of seconds, so that’s why you’d want to paginate it. We’ve found it consistently works for 500 rows, but we haven’t tested larger batches.
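If it helps, here’s a rough sketch of that pagination idea in Python. This is only an illustration, not a tested integration – the app URL, data type name, and token are placeholders you’d replace from your own app’s API settings, and you should verify the bulk endpoint’s expected format against Bubble’s Data API docs (it accepts newline-delimited JSON objects, one per line):

```python
import json
import urllib.request

# Placeholders -- substitute your app's domain, data type, and API token.
APP_URL = "https://yourapp.bubbleapps.io/api/1.1/obj/item/bulk"
API_TOKEN = "YOUR_API_TOKEN"
BATCH_SIZE = 500  # the batch size we've found reliable

def batches(rows, size=BATCH_SIZE):
    """Yield successive slices of `size` rows from `rows`."""
    for i in range(0, len(rows), size):
        yield rows[i:i + size]

def bulk_create(records):
    """POST one batch to the Data API bulk endpoint as
    newline-delimited JSON (one object per line)."""
    body = "\n".join(json.dumps(r) for r in records).encode()
    req = urllib.request.Request(
        APP_URL,
        data=body,
        headers={
            "Authorization": f"Bearer {API_TOKEN}",
            "Content-Type": "text/plain",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode()

# Usage: for batch in batches(all_rows): bulk_create(batch)
```

So 1,000,000 rows would become 2,000 calls of 500 records each, instead of 1,000,000 individual writes.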
This reply doesn’t seem all too professional or appropriate…
I’m trying to run a bulk action on a list of 1,000,000 pieces of data, but Bubble can’t handle it under my current plan.
I submitted a Bubble bug report, and support directed me to the forum to find a way to fire 1,000,000 API calls over time using a script (or other tool) so the bulk actions get completed without Bubble having to parse the entire list (or without me upgrading to a dedicated plan).
I’m now asking in the forum how best to go about doing that.
It will be sending off 1,000,000 or so unique IDs for things already loaded into the Bubble database, and making changes to those things when it hits the endpoint.
My question here is about the best way to fire those 1,000,000 unique IDs at the API endpoint using a tool or a script.
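For what it’s worth, here’s the kind of script I have in mind – a minimal Python sketch that fires each unique ID at a Workflow API endpoint sequentially, with a small delay to stay under rate limits and a list of failures to retry later. The endpoint name (`update_thing`), the token, and the `unique_id` parameter name are all placeholders, not anything Bubble defines:

```python
import json
import time
import urllib.request

# Placeholders -- substitute your app's workflow endpoint and API token.
WF_URL = "https://yourapp.bubbleapps.io/api/1.1/wf/update_thing"
API_TOKEN = "YOUR_API_TOKEN"
DELAY = 0.1  # seconds between calls, to throttle the run

def payload(unique_id):
    """Build the JSON request body for one ID."""
    return json.dumps({"unique_id": unique_id}).encode()

def fire_one(unique_id):
    """POST a single unique ID to the workflow endpoint."""
    req = urllib.request.Request(
        WF_URL,
        data=payload(unique_id),
        headers={
            "Authorization": f"Bearer {API_TOKEN}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

def fire_all(ids, delay=DELAY):
    """Fire every ID in sequence; collect failures for a retry pass."""
    failed = []
    for uid in ids:
        try:
            fire_one(uid)
        except Exception:
            failed.append(uid)
        time.sleep(delay)
    return failed
```

At 0.1 s per call that’s over a day of runtime for 1,000,000 IDs, which is why I’m also interested in whether there’s a better tool for this than a plain loop.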