Yes, it is available in a workflow to then process.
So what do we do when we have thousands of rows to import?
Is it possible to schedule this as an API workflow so it doesn’t slow down the user experience? Even with fewer than 100 entries (only 3 columns), the app is timing out.
Was wondering the same thing recently. Would second this request. (And open to hearing suggestions for viable alternatives).
Have you seen a demo of how to do this, or some more detailed instructions? I have been following your screenshots, but I am not getting the same options as you.
I am on the Hobby plan, but I used the boost function to enable this feature and I am not sure whether it is working. Should this work? Would I be able to use this functionality for an hour if I use this method?
I just want to upload a csv file into the app and save the data so it can be manipulated.
Can this limit be increased on a dedicated plan?
Yes we can talk about this via email@example.com
If you have to import as the admin, then you upload via the Editor (Live Version > App Data > Upload). If you want your Users to be able to upload thousands of rows, then a dedicated plan or higher tiered plan is your primary option.
You might also be able to redirect your users to a Google Sheet or something similar, and use an external workflow tool (e.g. Zapier) to create the records via the API.
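To make the external-tool route concrete, here is a minimal sketch of pushing parsed CSV rows into Bubble through its Data API (one POST per record). The app name, data type, field names, and token below are placeholder assumptions, not from this thread:

```python
import csv
import io
import json

def csv_to_payloads(csv_text):
    """Parse CSV text (first line is the header) into JSON-ready dicts."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [dict(row) for row in reader]

def upload_rows(payloads, app="yourapp", data_type="item", token="API_TOKEN"):
    """Sketch of the Data API call; not executed here. Creates one
    Bubble record per POST to /api/1.1/obj/<data_type>."""
    import urllib.request
    url = f"https://{app}.bubbleapps.io/api/1.1/obj/{data_type}"
    for payload in payloads:
        req = urllib.request.Request(
            url,
            data=json.dumps(payload).encode(),
            headers={"Content-Type": "application/json",
                     "Authorization": f"Bearer {token}"},
            method="POST",
        )
        urllib.request.urlopen(req)

# Hypothetical 3-column-style sample, trimmed to 2 columns for brevity
sample = "name,qty\nwidget,3\ngizmo,7\n"
payloads = csv_to_payloads(sample)
```

Since each record is a separate request, this sidesteps the runmode CSV row limit entirely, at the cost of one API call per row.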
Is this 100-record limitation on the Professional paid plan? What about the Personal paid plan? Limitations like this cause lots of extra work trying to find work-arounds. I can understand some reasonable limit, but 100 is almost useless. If not, does anyone know of another solution? How would this work using the API?
I see that the limit is 200 records for Professional Plan. Way too limiting for business/real-life use. Would hope for something like 1,000 to be reasonable. Unfortunately, now I MUST find another work-around (which is a shame when they have a solution in place). Disappointing.
I assume this will not work if one of the fields you want to import is a join to another data type, since it is not a number, text, date, address, or boolean?
Thanks - Craig
I will face the same challenge, and 1k is only the minimum I need. @bubble, is it possible to review the limit per plan, since it no longer matters with the current way workflows are processed? My suggestion would be no limit at all, or 10,000 for Personal, 1M for Professional, and unlimited for dedicated servers.
@JohnMark Raising limits for runmode csv upload is a popular request. Since runmode csv uploads are data intensive, this might require upgraded server capacity - a change we would have to evaluate carefully.
I am running into the same issue as some people above. I need users to upload an XLS file themselves, which gets converted into a CSV through an API, but most of the time they get an error, as the files are usually 200-300 rows long. How can we work around this? Is there a way to get the 100-row limit increased, or can anyone point me to an API or plugin that splits the CSV into multiple files?
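For the splitting approach, here is a rough sketch of chopping one CSV into chunks that each stay under the 100-row limit, repeating the header in every chunk so each piece imports on its own (the function and sample data are illustrative, not a Bubble feature):

```python
import csv
import io

def split_csv(csv_text, max_rows=100):
    """Split a CSV string (with a header row) into chunks of at most
    max_rows data rows each, repeating the header in every chunk."""
    rows = list(csv.reader(io.StringIO(csv_text)))
    header, data = rows[0], rows[1:]
    chunks = []
    for start in range(0, len(data), max_rows):
        buf = io.StringIO()
        writer = csv.writer(buf)
        writer.writerow(header)
        writer.writerows(data[start:start + max_rows])
        chunks.append(buf.getvalue())
    return chunks

# A 250-row file splits into chunks of 100, 100, and 50 data rows
big = "a,b\n" + "\n".join(f"{i},{i * 2}" for i in range(250)) + "\n"
parts = split_csv(big, max_rows=100)
```

Each chunk could then be uploaded in a separate run, or queued one per API Workflow call.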
Based on a recent video from @romanmg, a service called parabola.io can do this, but it starts at $9/month, so it isn’t a free solution. In Parabola, I was able to pass in the CSV, then export it (all through an automated “flow”) via an API Export to Bubble. I still wish Bubble would lift these low limits instead of making us hunt for solutions when they already have one in place. Check out Gaby’s video.
Another idea would be to import the file to Bubble (5-10 sec.) and start an API Workflow to go through it (a counter with regex commands). I will let you know if it works (I see no reason it won’t). This would eliminate the need for outside services (and be faster).
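As a very rough sketch of that counter-plus-regex idea: store the raw CSV as text, then have each workflow iteration pull out row number N. The helper below is a hypothetical illustration and assumes simple rows (no quoted commas or embedded newlines), which real CSVs may violate:

```python
import re

def nth_row(csv_text, n):
    """Return data row n (1-based, header is row 0) from raw CSV text,
    roughly what a regex-extraction step driven by a counter would do.
    Assumes no quoted commas or embedded newlines in fields."""
    lines = re.split(r"\r?\n", csv_text.strip())
    if n < len(lines):
        return lines[n].split(",")
    return None  # counter has passed the last row; stop the loop

raw = "name,qty\nwidget,3\ngizmo,7"
```

The workflow would increment the counter and reschedule itself until `nth_row` comes back empty.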