Is there some sort of throttle on the CSV import? I've got two fields (one text, one address) and it validated about 350 rows, then timed out. If I restart, it does one row and times out. If I wait a few minutes after a timeout, it does 6-7 rows very fast, then stops and times out again. I'm guessing there is extra work because it's an address field, and that's causing the issue?
Now getting this. Not sure what is throwing that, what the limit is, or how I can get around it (say, with my own Google Maps account, if that's what it's pinging).
@jim1 One option is to import the column as text, then run a scheduled API workflow to save it to an address field, with error handling for when the address format is considered invalid by Google Maps geocoding. For an immediate fix, remove or modify the offending address and try re-uploading.
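To illustrate the error-handling part of that workflow, here is a minimal sketch of what handling a geocoding response could look like. It assumes you call the Google Geocoding API yourself and inspect the standard `status` field in the JSON reply; the function name `extract_coords` is hypothetical, and fetching the response (with your own API key) is left out.

```python
import json

def extract_coords(response_text):
    """Parse a Google Geocoding API JSON response.

    Returns (lat, lng) on success, or None when the address could not
    be resolved (e.g. status "ZERO_RESULTS" or "INVALID_REQUEST"),
    so the row can keep its raw text and be flagged for review.
    """
    data = json.loads(response_text)
    if data.get("status") != "OK" or not data.get("results"):
        return None  # bad/unrecognized address: don't write the field
    loc = data["results"][0]["geometry"]["location"]
    return (loc["lat"], loc["lng"])

# Example responses in the Geocoding API's documented shape:
ok_response = json.dumps({
    "status": "OK",
    "results": [{"geometry": {"location": {"lat": 40.7128, "lng": -74.0060}}}],
})
bad_response = json.dumps({"status": "ZERO_RESULTS", "results": []})

print(extract_coords(ok_response))   # coordinates tuple
print(extract_coords(bad_response))  # None -> leave as text, retry later
```

Running the rows through a check like this one at a time (on a schedule) also spaces out the geocoding calls, which avoids the burst that seems to trigger the timeout.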