Crawl postponed because robots.txt was inaccessible

2 replies
  • SEO
Hi, can anyone tell me how I can fix this error showing up in Google Webmaster Tools?
My site is Internet Marketing Leaks - WSO's Product Review and Discount.
#crawl #inaccessible #postponed #robotstxt
  • nest28
    It's nothing to worry about.
  • cheapstuff
    Actually, it can be a big issue. One of my sites was not getting crawled because the hosting was making the robots.txt file unreachable. Since Google knew I had a robots.txt file, it did not want to crawl my site for fear of breaching the rules I had set in that file.

    So after a while of not being crawled, the site dropped out of the search engine. I had to change hosts because they could not resolve the issue... although I had a few accounts with the same host that had no issues. (A quick way to check whether your robots.txt is reachable is sketched below.)
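If you want to confirm whether this is what is happening on your site, one quick check is to fetch robots.txt yourself and look at the HTTP status. Google postpones crawling when the file is unreachable or returns a server error; a 200 (rules readable) or a 404 (no rules) is fine. Below is a minimal Python sketch, not a definitive tool; the example.com address is a placeholder you would replace with your own domain.

    # Minimal sketch: check whether robots.txt answers with a usable status code.
    import urllib.error
    import urllib.request

    SITE = "http://example.com"  # placeholder; replace with your own domain

    try:
        resp = urllib.request.urlopen(SITE + "/robots.txt", timeout=10)
        print("robots.txt returned HTTP", resp.status)  # 200: rules are readable
    except urllib.error.HTTPError as e:
        # 404 is fine (treated as "no restrictions"); 5xx is what postpones the crawl
        print("robots.txt returned HTTP", e.code)
    except OSError as e:
        # URLError, connection failure, or timeout -- the condition behind this warning
        print("robots.txt is unreachable:", e)

If the script times out or reports a 5xx error while the rest of the site loads fine, the problem is on the hosting side, as described above.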
