Why isn't Google grabbing my robots.txt?

by vivo
0 replies
  • SEO
I had problems with Google indexing my site, but I was able to get most of it indexed without a problem. Now it is listing these errors again:

If I open up the permissions so anyone can read it (666 or 777):
"URL timeout: robots.txt timeout"

If I leave it at the default (644, I think):
"Network unreachable: robots.txt unreachable. We were unable to crawl your Sitemap because we found a robots.txt file at the root of your site but were unable to download it. Please ensure that it is accessible or remove it completely."

What is up with this? I had no problems with them before. Is it the format of my sitemap? I thought maybe my forum's spider settings were getting in the way, but the error points at the main directory, and when I tested with the spider removed, nothing changed. What's up?

Site: http://www.mildaspergers.com/sitemap.txt
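A quick way to narrow this down is to request the file directly and see whether you get a normal response, an HTTP error, or a hang. Here is a minimal Python sketch of that check (my own, not from the original post; it assumes robots.txt sits at the site root, as Google's message says):

# Minimal diagnostic sketch: fetch robots.txt the way a crawler would and
# report the HTTP status, an HTTP error, a network failure, or a timeout.
# The URL is an assumption based on Google's "root of your site" message.
import socket
import urllib.error
import urllib.request

URL = "http://www.mildaspergers.com/robots.txt"

try:
    with urllib.request.urlopen(URL, timeout=10) as resp:
        body = resp.read()
        print(f"HTTP {resp.status}: fetched {len(body)} bytes")
        print(body[:200].decode("utf-8", errors="replace"))
except urllib.error.HTTPError as e:
    # A 403 here would point at server-side permissions or ownership;
    # 644 (rw-r--r--) already makes the file world-readable.
    print(f"HTTP error: {e.code} {e.reason}")
except urllib.error.URLError as e:
    # DNS or connection failures look like Google's "robots.txt unreachable".
    print(f"Network error: {e.reason}")
except socket.timeout:
    # A hang here matches Google's "robots.txt timeout" report.
    print("Request timed out")

If a check like this fetches the file fine from outside while Googlebot still times out, one common cause is a firewall or host-level rule blocking crawler IPs rather than file permissions, since 644 is already readable by the web server.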