Hey guys, I have 404 error codes growing into the thousands and I'm quite sure they are affecting my SERPs. Can I put these links into my robots.txt file?
Do I need to put a code in front of them?

Thanks,

Matt
#404 #codes #error
  • JezWebb
    If your server is returning a clear 404 error for any missing pages, then these pages should be deindexed over time.

    I'd find out where the requests for these pages are coming from. If they're from internal links, fix the links. If they're external, consider a redirect using .htaccess so you keep getting link juice from those external links.
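    For example, something like this in .htaccess (a minimal sketch, assuming Apache with mod_alias; the paths are placeholders, not your actual URLs):

    # Permanently redirect a single missing URL to its closest live page
    Redirect 301 /old-missing-page /replacement-page
    # Or catch a whole pattern of broken URLs in one rule
    RedirectMatch 301 ^/old-section/(.*)$ /new-section/$1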

  • SteveSRS
    Hi,

    1. Check where the 404s are coming from and what content the requests are looking for.
    2. Redirect each request to a page with the content it's looking for (or close to it), using a 301 via .htaccess.
    3. If you don't have the content they're looking for, make a good-looking 404 page (always do this, and make sure it sends a 404 status in the header). First, make clear it is a not-found page; second (a trick so many people forget), offer your best content items (summaries + links) to convince the visitor to stay! See the sketch below.

    Seriously, one could make a WSO from these steps... one of the most forgotten techniques for webmasters, and so important.
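    For step 3, the .htaccess side can be as simple as this (a sketch, assuming Apache; /404.php is a placeholder for your own page):

    # Serve a custom not-found page; using a local path (not a full URL)
    # keeps the 404 status code instead of turning it into a redirect
    ErrorDocument 404 /404.php
    # If your CMS or script generates the page itself, make sure it sends
    # the 404 status, e.g. in PHP: http_response_code(404);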
    • mattchart
      We decided to translate our website into 30 languages to help us rank in international search engines, using the GTS Translation language plugin. We then took a sitemap of almost 100,000 links and submitted it to Google for indexing. Now that Google is indexing our site, we are getting a lot of broken links that seem to mostly be coming from this plugin.

      We did design a 404 redirect page, so what you guys are saying is to take all these links and put them in the .htaccess file to send them to that page?
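      Something like this is what I'm picturing in .htaccess (just a sketch; these plugin paths are made up, not our actual URLs):

      # Send each broken plugin URL to the closest real page with a 301
      Redirect 301 /fr/broken-plugin-page /fr/
      Redirect 301 /de/broken-plugin-page /de/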

      I did take about 2,000 of the 8,000 broken ones and put them into the robots.txt file, and Webmaster Tools registered fewer broken links the next day. We also had a positive ranking jump the day after I placed some of the links in robots.txt, although it could be coincidence, as some backlinks registered that day as well. Wouldn't the best solution be to just put them all in robots.txt?

      I used this format in robots.txt:

      User-agent: *
      Disallow: /mybrokenlink1.com
      Disallow: /mybrokenlink2.com
      etc.

      Will the .htaccess redirects also remove the errors, since the URLs are being redirected?

      Aren't both these methods basically the same? Why would I want to redirect them vs. disallow them from being indexed?

      Thanks In Advance!
  • Microsys
    Why not redirect your broken links to more appropriate content? Or fix what is causing the broken links?
    • wtd1
      Originally Posted by Sitemapper

      Why not redirect your broken links to more appropriate content? Or fix what is causing the broken links?
      Why do you have links that point to pages that aren't there?
      The solution is either to fix the links so they point to a real page or to remove them.
