Got a message from Google

4 replies
  • SEO
Warrior Experts,

Please help me solve this issue.

I got a message from Google in GWT (Google Webmaster Tools) that reads:

Over the last 24 hours, Googlebot encountered 39 errors while attempting to access your robots.txt. To ensure that we didn't crawl any pages listed in that file, we postponed our crawl. Your site's overall robots.txt error rate is 30.5%.

I really don't know how to fix this error, and I'm a little worried about my website's performance because of it.

Also, I planned to disallow some URLs in my robots.txt (the URLs shown in the Crawl Errors section).

Please give me some suggestions for getting rid of this issue!

Thanks in advance,
Steffy
#google #message
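
The GWT message above is about Googlebot failing to fetch robots.txt at all (server errors or timeouts), not about what the file contains, so a first step is to confirm the file is reachable and returns HTTP 200. Below is a minimal Python sketch; example.com is a placeholder for your own domain.

Code:
import urllib.error
import urllib.request

# Placeholder domain; replace with your own site.
ROBOTS_URL = "https://example.com/robots.txt"

try:
    with urllib.request.urlopen(ROBOTS_URL, timeout=10) as resp:
        print("Status:", resp.status)  # a healthy robots.txt returns 200
        print(resp.read().decode("utf-8", errors="replace"))
except urllib.error.HTTPError as err:
    # 4xx/5xx responses; repeated 5xx errors can make Googlebot postpone
    # its crawl, as described in the GWT message.
    print("HTTP error:", err.code)
except urllib.error.URLError as err:
    # DNS failures, timeouts, connection resets, and similar problems.
    print("Connection error:", err.reason)

If the script (or simply loading the file in a browser) intermittently fails, the fix is on the hosting side (an overloaded or misconfigured server) rather than in the robots.txt rules themselves.
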
  • adwordsmac:
    Post your robots.txt.
  • adwordsmac:
    Put a # in front of the last line.
  • Punit12:
    Originally Posted by Steffy Rose View Post

    Over the last 24 hours, Googlebot encountered 39 errors while attempting to access your robots.txt. To ensure that we didn't crawl any pages listed in that file, we postponed our crawl. Your site's overall robots.txt error rate is 30.5%.

    Dear Steffy,

    As per your robots.txt file, you have this code:

    User-agent: *
    Disallow: /

    What does it say?
    It removes your site from search engines and prevents all robots from crawling it in the future.

    This is the problem.

    How to fix it?
    Instead of your existing robots.txt code, use the following:

    Code:
    User-agent: *
    Allow: /
    Disallow: /ABC-PQR.htm


    You can block the pages that you don't want crawled simply by adding a Disallow: line like the sample above; a quick way to sanity-check the rules before uploading is sketched after this reply.

    Signature

    Punit Kansara
    SEO Strategist

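
Punit12's suggestion can be sanity-checked before uploading the new robots.txt by testing it with Python's standard-library robots.txt parser. Below is a minimal sketch; example.com and /ABC-PQR.htm are placeholders carried over from the sample above, not real paths. Note that Python's parser applies rules in file order (first match wins), so the redundant Allow: / line is left out of this local test.

Code:
from urllib.robotparser import RobotFileParser

# Proposed rules, pasted as a string so they can be tested before uploading.
# The "Allow: /" line from the sample above is omitted here because Python's
# parser takes the first matching rule, and "Allow: /" matches every URL,
# which would mask the Disallow during this local test.
rules = """\
User-agent: *
Disallow: /ABC-PQR.htm
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# can_fetch(user_agent, url) is True when the rules permit crawling that URL.
print(rp.can_fetch("*", "https://example.com/some-page.html"))  # True
print(rp.can_fetch("*", "https://example.com/ABC-PQR.htm"))     # False

Googlebot itself resolves conflicting rules by specificity rather than file order, so the extra Allow: / line in the sample above should be harmless on the live site.
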
