How to fix crawler errors?

9 replies • SEO
My website has 119 server errors (code 500) and 201 not-found errors (code 404). What should I do? Should I collect all of them and disallow them in the robots.txt file, or is there another way to fix them? Please let me know.
#crawler #error #fix
  • plusdollar
    I believe the spider is crawling your site but finding internal links that do not exist. You'll want to find out where these bad links are and fix them; otherwise it will be frustrating for your visitors as well when they click on them.

    Try this:
    Free Broken Link Checker / Online URL Validator - finds bad / dead weblinks
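For the curious, the core of such a link checker is easy to sketch with Python's standard library: pull every href out of a page, then request each one and log anything that comes back 404 or 500. This is a minimal sketch (the sample HTML is made up, not from any real site); a real checker would add the HTTP requests, e.g. with urllib.

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag so each URL can be checked."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Made-up page: one working link, one that would come back 404.
page = '<a href="/good">ok</a> <a href="/missing-page">dead?</a>'
extractor = LinkExtractor()
extractor.feed(page)
print(extractor.links)  # → ['/good', '/missing-page']
```

Each collected URL would then be fetched (a HEAD request is enough) and anything returning 404 or 500 is a link to fix.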
  • Robertarticle,

    You would be better off redirecting those pages to ones that are working. Webmaster Tools has detailed reports on which links are causing issues, so you can see exactly where they are. You can then either fix them or redirect them to working pages.

    Here is info on how to deal with them as well:

    404 (Not found) - Webmaster Tools Help

    Best,

    Shawn
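The redirect approach can be sketched as a simple lookup: map each broken URL to its working replacement and issue a 301, and let anything unmapped fall through to a plain 404 so crawlers drop it. The paths below are made-up examples, not from the thread.

```python
# Hypothetical redirect map: each broken URL -> the working page it
# should 301 to. These paths are made-up examples.
REDIRECTS = {
    "/old-article": "/new-article",
    "/sale-2012": "/sale",
}

def resolve(path):
    """Return (status, location): a 301 to the mapped page if we know
    where the URL moved, otherwise a plain 404 so crawlers drop it."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    return 404, None

print(resolve("/old-article"))    # → (301, '/new-article')
print(resolve("/never-existed"))  # → (404, None)
```

The same mapping is usually expressed as 301 rules in your server config, but the logic is exactly this dictionary lookup.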
    • paulgl
      There's no reason to redirect. Just set up a custom 404 page.

      It does not matter, unless those URLs are ones that are SUPPOSED TO BE THERE!

      It's a common occurrence, and no reason for anyone to panic.

      If you actually coded the site yourself, and did not use some lame CMS like WordPress, there would be no problem anyway.

      It's time for people who want to be webmasters to take the bull by the horns and actually become webmasters!

      Learn something!

      Paul
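A custom 404 page, as Paul suggests, can be as small as overriding the default error page. Here's a minimal sketch using Python's standard http.server; the page text is a made-up example, and a real site would do this in its own server or CMS config.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical friendly 404 page; any HTML you like works here.
CUSTOM_404 = """<html><body>
<h1>Sorry, that page does not exist.</h1>
<p>Try the <a href="/">home page</a> instead.</p>
</body></html>"""

class FriendlyHandler(BaseHTTPRequestHandler):
    # send_error() renders this template whenever a request 404s,
    # so visitors see a helpful page instead of the bare default.
    error_message_format = CUSTOM_404

# To run it for real:
#   HTTPServer(("", 8000), FriendlyHandler).serve_forever()
```

The point is the user experience: the server still answers 404 (so crawlers know the page is gone), but the visitor gets somewhere useful to go.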
  • robertarticle
    If I block all of the crawler-error URLs in the robots.txt file, is that good practice?
    • gladwinforum
      Originally Posted by robertarticle View Post

      If I block all of the crawler-error URLs in the robots.txt file, is that good practice?
      Hmm, that's not a problem; you can do that. Or just remove those broken links; that would be the better fix.
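For reference, blocking those URLs in robots.txt would look like the sketch below (the paths are made-up examples). Note that Disallow only stops compliant crawlers from fetching the URLs; it does not fix the broken links, and already-indexed URLs can remain in the index.

```
User-agent: *
Disallow: /old-article
Disallow: /sale-2012
```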
  • jovykhan
    Simple. Just select all and click mark as fixed.
    • gladwinforum
      Originally Posted by jovykhan View Post

      Simple. Just select all and click mark as fixed.
      Ha, don't you know it shows the error again? The error will keep showing until it's actually fixed.
      • jovykhan
        Originally Posted by gladwinforum View Post

        Ha, don't you know it shows the error again? The error will keep showing until it's actually fixed.
        From my experience: I had a site with thousands of 404s, and all I did was mark them as fixed, without actually fixing them. They never showed up again in GWT.
  • gladwinforum
    What you did is not a real fix, and you should understand that first. The alert from WMT is not the real problem; the underlying errors have to be fixed, or your site will suffer.
