I am not in the IM niche and have nothing to sell.
Any autoblog product that installs one or more files on the website leaves a huge and easily detectable footprint. That means any product that installs code into the wp-content/plugins directory, for example, is at the mercy of the search engines.
Sure, you can exclude the plugin directory with robots.txt, but who is to say the search engines obey robots.txt when they are taking down a network or investigating a suspicious product? And what about the human reviewers? They can find your plugins in a heartbeat.
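For anyone who wants to try the exclusion anyway, the standard robots.txt syntax looks like this (a sketch only; note it is a polite request to crawlers, not access control, which is exactly why it offers no real protection):

```text
# robots.txt at the site root
User-agent: *
Disallow: /wp-content/plugins/
```

The plugin files are still publicly fetchable by anyone who requests them directly, reviewers included.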
The search engines already have the data on which sites are using autoblog plugins. They just need the will to mine it and then bring your sites or network down.
Here is the proof.
Want a list of WP Robot installs? Here it is.
Enter this search term into Google:
What about Autoblogged?
I have only tested those three search terms so far, but that was enough to make me run a mile from any of these products, which is a shame because I was just about to buy WP Robot.
What is the solution then?
The only way to run an autoblog system that does not leave a footprint at all is to ensure that it does not install a single file into your website. You can do this in two ways:
1) Run the system entirely from a central server and issue posts to the remote sites via an XML-RPC style interface.
2) Install scripts outside the public_html directory (on Linux systems) and run them with cron jobs.
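To make option 1 concrete, here is a minimal sketch in Python of a central server pushing a post to a remote WordPress blog over XML-RPC (WordPress exposes this through xmlrpc.php and the metaWeblog API). The URL and credentials are placeholders, not a real site:

```python
# Sketch: central server posting to a remote WordPress site via XML-RPC.
# Nothing is installed on the blog itself, so there is no plugin footprint.
import xmlrpc.client

SITE = "https://example.com/xmlrpc.php"  # placeholder remote blog
USER, PASSWORD = "admin", "secret"       # placeholder credentials


def build_post(title, body):
    """Build the struct that metaWeblog.newPost expects."""
    return {"title": title, "description": body}


def push_post(server_url, user, password, post):
    """Send one post to a remote WordPress install; returns the new post ID."""
    server = xmlrpc.client.ServerProxy(server_url)
    return server.metaWeblog.newPost(0, user, password, post, True)


post = build_post("Hello from the central server",
                  "<p>No plugin installed on this site.</p>")
# push_post(SITE, USER, PASSWORD, post)  # uncomment against a live blog

# For illustration, this is the request body the client would send:
payload = xmlrpc.client.dumps((0, USER, PASSWORD, post, True),
                              methodname="metaWeblog.newPost")
print("metaWeblog.newPost" in payload)
```

Option 2 would be something like a poster script sitting above the web root and a crontab entry along the lines of `*/30 * * * * php /home/user/scripts/poster.php` (illustrative path only): nothing under public_html, so nothing for a crawler to find.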
As an ex-professional PHP developer, I have decided that the only way forward for me is to develop my own centralised system that posts to remote sites (it will not be commercialised). It will not install a single file on those sites.
I didn't want to spend time developing this, as I would rather have bought an existing product, but the simple fact is that not one of the autoblogging plugin products I have looked at so far seems remotely safe from a potentially huge search engine slap in the future.