Google Sandbox - My Theory
I am in the process of setting up about 18 new domains for a Christmas promotion. I added them to my Google Webmaster Tools account and submitted sitemaps for each of them. After waiting several hours for Google to process them, many showed a status of "URL timeout: robots.txt timeout" (or something close to that -- I can't remember the exact phrasing). This was strange, because most of the domains were set up identically with regard to sitemaps and robots.txt files.
A search on Google revealed that this is a fairly common problem, and it means that the Google bots are unable to access your site. If you search for "URL timeout: robots.txt timeout" and look for a result that goes to Google Groups, you should be able to find the explanation that I found.
The problem turns out to be that the web host ends up blocking IP addresses that happen to belong to Google bots. It's unclear exactly how this happens, but some of my investigation suggested that automated security software on the web host's machine might block those IP addresses for some reason. I also found a complaint from someone who got the "URL timeout: robots.txt timeout" error every Christmas. Perhaps with the flood of new web sites around Christmas, the Google bots end up flooding web host machines with traffic, and the host machine's security software interprets that as an attack and blocks the offending IP addresses. I suppose a similar effect might occur at other times of the year, though less frequently.
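If you suspect your host's security software is blocking (or letting through) a crawler, one useful check is whether an IP address in your server logs really belongs to a Google bot. Google's documented approach is a reverse DNS lookup followed by a forward lookup that must resolve back to the same IP. Here is a minimal Python sketch of that idea; the function names are my own, not anything official:

```python
import socket

def is_googlebot_hostname(hostname):
    # Genuine Googlebot reverse-DNS names end in googlebot.com or google.com
    return hostname.rstrip(".").endswith((".googlebot.com", ".google.com"))

def verify_googlebot(ip):
    """Reverse-DNS the IP, check the name, then forward-resolve to confirm."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)  # reverse lookup
    except OSError:
        return False
    if not is_googlebot_hostname(hostname):
        return False
    try:
        # Forward-confirm: the hostname must resolve back to the same IP,
        # otherwise the reverse record could be faked
        return ip in socket.gethostbyname_ex(hostname)[2]
    except OSError:
        return False
```

An IP that passes this check but shows up in your host's firewall block list would be strong evidence of the problem described above.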
When this problem has occurred for people, they apparently see their search position drop suddenly and precipitously; then, if the problem remains unresolved, their Google listings disappear entirely.
This looks very much like the Google Sandbox effect, so I wonder if it could be one cause. With the problem I just got resolved, other people on the same shared host machine might not have realized that their absence from Google was because our machine was blocking Google bot IPs. Now that I have opened a trouble ticket and had it resolved, perhaps their web sites will show up in Google. (In our case, there was actually one Google bot that was not blocked, so a few of my domains were being crawled successfully; others on the same machine might therefore not have been de-listed entirely, but I wonder whether their listings disappeared from Google erratically at times.)
Anyway, if you are getting "URL timeout: robots.txt timeout" errors in Google Webmaster Tools and you can't find anything wrong with your robots.txt file (or you don't even have one), this might be your problem. And if you seem to be stuck in the Google Sandbox, consider checking whether this could be the reason your domains are not listed.
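If you want to test from outside Webmaster Tools whether a site's robots.txt is reachable at all, a quick check like the following Python sketch can help (the function names and return format are my own invention). A timeout or "unreachable" result for a site that loads fine in your browser is consistent with the kind of IP-level block described above, since your IP and Google's would be treated differently:

```python
import socket
import urllib.error
import urllib.request

def robots_url(base_url):
    """Build the robots.txt URL for a site."""
    return base_url.rstrip("/") + "/robots.txt"

def check_robots(base_url, timeout=10):
    """Try to fetch robots.txt and classify the outcome."""
    try:
        with urllib.request.urlopen(robots_url(base_url), timeout=timeout) as resp:
            return ("ok", resp.status)
    except urllib.error.HTTPError as e:
        # A 404 here is harmless: no robots.txt simply means "crawl everything"
        return ("http_error", e.code)
    except (urllib.error.URLError, socket.timeout) as e:
        # DNS failure, connection refused, or timeout -- possibly a block
        return ("unreachable", str(e))
```

Of course, this only tells you how the site looks from your own network; to see what Google sees, you still need the Webmaster Tools report.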