Googlebot unable to access robots.txt

5 replies
I am receiving the following error from Google Webmaster Tools:

Network unreachable: robots.txt unreachable
We were unable to crawl your Sitemap because we found a robots.txt file at the root of your site but were unable to download it. Please ensure that it is accessible or remove it completely.

Has anyone ever received this error? MSN and Yahoo have no trouble crawling my site, and my robots.txt is not blocking Googlebot.
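As a first sanity check, you can try fetching the file yourself the way a crawler would. A minimal Python sketch (the helper names and the "robots-check" User-Agent are mine, not Google's) that distinguishes a network-level failure, the "unreachable" case in the error above, from an ordinary HTTP error:

```python
# Fetch a site's robots.txt and report whether it was reachable at all.
from urllib.error import URLError
from urllib.parse import urlsplit, urlunsplit
from urllib.request import Request, urlopen

def robots_url(site_url):
    """Derive the root robots.txt URL from any page URL on the site."""
    parts = urlsplit(site_url)
    return urlunsplit((parts.scheme, parts.netloc, "/robots.txt", "", ""))

def check_robots(site_url, timeout=10):
    """Return (http_status, note); status is None if the host is unreachable."""
    req = Request(robots_url(site_url), headers={"User-Agent": "robots-check"})
    try:
        with urlopen(req, timeout=timeout) as resp:
            return resp.status, "fetched OK"
    except URLError as exc:
        return None, "unreachable: %s" % exc.reason
```

Bear in mind that a successful fetch from your own machine does not prove Googlebot can reach the file: a host-side firewall can still block Google's crawlers specifically, since they connect from Google's own IP ranges.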
#access #googlebot #robotstxt #unable
  • webdesigners
    Please check that your robots.txt file's permissions are set to 777 (chmod 777).
  • Jeremiaho
    I remember reading somewhere that the problem could be with your web host's firewall settings. Check with your host; they may need to allow a higher threshold of concurrent connections.
    • Devilfish
      The robots.txt file doesn't need to be chmod 777; 644 is sufficient.

      Check that there aren't any Windows characters in the file. Edit it with a Unix editor or with your FTP client. If necessary, create a new file and see if that helps.
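The "Windows characters" Devilfish mentions are usually CRLF line endings or a UTF-8 byte-order mark left by a Windows editor. A minimal sketch for detecting and normalizing them (the function names are hypothetical, not from any tool in this thread):

```python
BOM = b"\xef\xbb\xbf"  # UTF-8 byte-order mark some Windows editors prepend

def robots_issues(raw: bytes):
    """Flag byte-level problems that can make a robots.txt unparsable."""
    issues = []
    if raw.startswith(BOM):
        issues.append("utf-8 BOM")
    if b"\r\n" in raw:
        issues.append("CRLF line endings")
    elif b"\r" in raw:
        issues.append("bare CR characters")
    return issues

def normalize(raw: bytes) -> bytes:
    """Strip the BOM and convert all line endings to Unix LF."""
    if raw.startswith(BOM):
        raw = raw[len(BOM):]
    return raw.replace(b"\r\n", b"\n").replace(b"\r", b"\n")
```

Run `robots_issues()` on the raw bytes of your file; if it reports anything, write back the `normalize()`d bytes (or recreate the file in a Unix editor, as suggested above).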
  • rootpixel
    Use FileZilla (a free FTP client) and chmod the file to 777 permissions. Then try entering your website URL followed by /robots.txt in a browser.
  • ganeshseg
    Use SmartFTP and change the permissions of the robots.txt file to 644.
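On the permissions disagreement in this thread: 644 (owner read/write, everyone else read-only) is all a web server needs to serve the file, while 777 also makes it world-writable, which is a needless security risk. A minimal sketch for setting and verifying 644 (the helper name is mine):

```python
import os
import stat

def set_world_readable(path):
    """chmod 644: owner read/write; group and others read-only.
    Returns the resulting mode as an octal string for verification."""
    os.chmod(path, stat.S_IRUSR | stat.S_IWUSR | stat.S_IRGRP | stat.S_IROTH)
    return oct(stat.S_IMODE(os.stat(path).st_mode))
```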
