How Do I Make Search Engines Ignore Duplicate Pages?

6 replies
I'm setting up some sales pages that are all very similar so that I can tailor them to specific traffic, but it means I'll end up with lots of pages that are almost identical.

I don't want Google to see it as duplicate content, so is there a way around it? I've been looking at robots.txt options, but it looks like it's not easy to set up what I'm after (and I don't want to risk my entire site not getting indexed).

Any ideas?
#duplicate #engines #ignore #make #pages #search
  • Profile picture of the author innocent07
    Originally Posted by Alex Taylor View Post

    I'm setting up some sales pages that are all very similar so that I can tailor them to specific traffic, but it means I'll end up with lots of pages that are almost identical.

    I don't want Google to see it as duplicate content, so is there a way around it? I've been looking at robots.txt options, but it looks like it's not easy to set up what I'm after (and I don't want to risk my entire site not getting indexed).

    Any ideas?
    Place this code in your head:

    Code:
    <meta name="robots" content="noindex"/>
    • Profile picture of the author Airwalk Design
      Could anyone help with a similar situation where Google indexes 5 versions of my page? The page uses a JavaScript rating system whose code includes 5 <a> tags, and Google indexes the page 5 times.
      • Profile picture of the author SolarPanelGuy
        I guess you could use the canonical tag for this.

        Basically you tell the search engines which one of your many similar pages is the main one.
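        For what it's worth, a canonical tag looks like this in the head of each of the similar pages (the URL below is just a placeholder, so swap in the address of whichever page you want treated as the main one):

        Code:
        <link rel="canonical" href="http://www.example.com/main-page.html"/>

        Every duplicate (and the main page itself, if you like) points at the same URL, and the search engines should then consolidate the duplicates under it instead of indexing each version separately.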
  • Profile picture of the author Kris Turner
    Noindex! Thanks, guys. I was sure there was something I was overlooking, but the only things that came to mind were nofollow and the robots.txt file.

    If I place a noindex tag in the head of all my duplicate pages, can I safely have, say, 10 copies of the same page without any problems?
    • Profile picture of the author imintern
      You can also create a file called robots.txt and put it in the root of your website. With it you can stop search engines from crawling specific files and folders.

      Here is an example of a robots.txt file -

      Code:
      User-agent: *
      Disallow: /cgi-bin/
      Disallow: /private/
      Disallow: /thank-you.html
      Disallow: /download.html

      In this example the directories "cgi-bin" and "private" and the web pages thank-you.html and download.html are blocked from being crawled, so the engines won't pick up their content. You can block any number of files and folders this way.

      Suggested Reading:

      Code:
      http://www.robotstxt.org/
      Edit
      Oops! Sorry, it seems I didn't read your post fully. However, I'll let it stay here in the hope that anybody reading the thread will benefit from the info.
