I built a Keyword Rank Checker and Competitive Analysis tool
It's awesome: it grabs up to 800 Google search results (80 pages) for a term.
I also added a bulk feature and have been testing it with 50 keywords at once (100 results each, i.e. the first 10 pages)... anyway,
I'm using proxies for all of this.
It's a self-hosted script that doesn't require a database; it saves all data to files in directories.
The idea is that I can drop it into any website I run and do my research without having to set up a database.
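For context, the database-free storage amounts to keyword-keyed files on disk. A minimal sketch of that approach (the directory name and function names here are hypothetical, not my actual code):

```python
import json
from pathlib import Path

DATA_DIR = Path("rank_data")  # hypothetical storage directory

def save_results(keyword: str, results: list) -> Path:
    """Save one keyword's SERP results as a JSON file -- no database needed."""
    DATA_DIR.mkdir(exist_ok=True)
    # Use a filesystem-safe version of the keyword as the filename
    safe = "".join(c if c.isalnum() else "_" for c in keyword)
    path = DATA_DIR / f"{safe}.json"
    path.write_text(json.dumps(results, indent=2))
    return path

def load_results(keyword: str) -> list:
    """Load previously saved results for a keyword."""
    safe = "".join(c if c.isalnum() else "_" for c in keyword)
    return json.loads((DATA_DIR / f"{safe}.json").read_text())
```

Since everything lives under one directory, moving the tool to another site is just copying the folder.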
I may sell this script as a one-time purchase, and the user (like myself) would be able to add XX amount of proxies... that's all well and good, no problems there.
However, I may also want to offer it, along with some other ideas, as a hosted service in the vein of SEMrush/Moz...
It wouldn't exactly be data mining or building one large accessible database.
It's more about users performing their own searches and saving their data so it's accessible to them later.
Does anyone have ideas on what proxy-to-user ratio I should be aiming for at a small scale, say
25–50 users, each performing up to a few hundred queries a day?
I can manage cycling proxies per request and tracking which were used and how many times, but I don't have any numbers to work from for how many total proxies I'd need relative to the number of users and total queries.
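For what it's worth, the per-request cycling and usage tracking I mean could be sketched as a simple round-robin rotator (class and proxy names are illustrative, not from my script):

```python
from collections import Counter
from itertools import cycle

class ProxyRotator:
    """Cycle through a fixed proxy pool per request and count usage per proxy."""

    def __init__(self, proxies):
        self._pool = cycle(proxies)   # endless round-robin iterator
        self.usage = Counter()        # how many times each proxy was handed out

    def next_proxy(self) -> str:
        """Return the next proxy in rotation and record that it was used."""
        proxy = next(self._pool)
        self.usage[proxy] += 1
        return proxy

# Example: 7 requests spread over 3 proxies
rotator = ProxyRotator(["p1:8080", "p2:8080", "p3:8080"])
for _ in range(7):
    rotator.next_proxy()
# usage ends up as {p1: 3, p2: 2, p3: 2}
```

The `usage` counter is what lets you see which proxies are doing the most work and whether the load stays even.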
Currently I've been using 5 proxies to perform at least 200–400 queries a day for the past 3 weeks and have had zero IP bans.
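Using my own numbers as a rough baseline (5 proxies comfortably handling ~400 queries/day, i.e. about 80 queries per proxy per day with no bans), a back-of-envelope estimate for the hosted scenario looks like this. The 300 queries/user figure is just my reading of "a few hundred", and it assumes the safe per-proxy rate scales linearly, which it may not:

```python
# Observed baseline: 5 proxies, ~400 queries/day, zero bans in 3 weeks
queries_per_proxy_per_day = 400 / 5   # = 80

# Hypothetical small-scale hosted service
users = 50
queries_per_user_per_day = 300        # assumption: "a few hundred"

total_queries = users * queries_per_user_per_day      # 15,000/day
proxies_needed = total_queries / queries_per_proxy_per_day
print(round(proxies_needed))          # roughly 188 proxies at the same load
```

So at the same per-proxy load I've been running, 50 heavy users would need on the order of a couple hundred proxies, not a handful, which is exactly why I'm hoping someone has real-world ratios to share.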