How To Avoid Duplicate Content?

22 replies • SEO
I understand that Google penalizes sites for "Duplicate Content". My question is, does Google penalize for duplicate content for pages like: Disclaimer, Privacy Policy, Terms of Use, About Us, and Contact Us?

If someone has several sites, all with these pages, chances are good that the content is going to be essentially, if not exactly, the same on these pages from site to site.

Do these need to be "Spun" like articles so the content is different, or do these pages need to be Do-Not-Crawl, or does Google somehow know what they are and that they are bound to have duplicate content and give sites a pass for it?

Thanks.
#avoid #content #duplicate
  • Mosa
    Google ranks the pages separately. Your privacy page, contact page, etc will not affect your other pages.
  • MikeFriedman
    What Google penalizes is the same content appearing over and over again on the same domain. You don't have to worry about any issues with your privacy, disclaimer, etc. pages.
  • Nelapsi
    Just put a noindex tag on those kinds of pages; that takes care of the issue for you.
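    A minimal sketch of the noindex approach described here: a robots meta tag placed in the <head> of each boilerplate page. The file names are just examples.

    ```html
    <!-- Inside the <head> of pages such as privacy-policy.html or terms-of-use.html -->
    <meta name="robots" content="noindex">
    ```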
  • lMlariaVC
    If you use a plugin like "All-in-One" SEO, you can specify pages you don't want indexed in that plugin's settings. There are also many plugins that generate generic privacy policy and terms-of-use pages, so I would think these types of pages are not a huge site-wide ranking factor in terms of duplicate content. Although that's just speculation.
  • yukon
    Like already said, it's no big deal unless the duplicate pages are on the same domain.

    Usually those types of pages get buried in the SERPs.

    Pages like Disclaimer, Privacy Policy, Terms of Use, About Us, and Contact Us might rank when a site is new, but I've always found they slowly drift down the SERPs on their own.
  • TDogger
    UMS, that is actually a very old Google blog post. Panda pretty much changed a lot of things. David has it right. It is bad to copy content from the web and put it on your web site. Google doesn't officially call a duplicate content penalty a "penalty" because it is a filter. When the filter detects a page using content taken from other sites, it reduces the rank position for that page. It only affects the entire site if you have too many of those pages, such as with autoblogs.

    While Google doesn't officially call it a penalty, if it reduces the rank position for the duplicate content page, and it walks like a duck and quacks like a duck, ...

    David, if you are concerned about Disclaimer, Privacy Policy, Terms of Use, About Us, and Contact Us pages that you copied from other sites, either add the robots noindex meta tag to those pages (as mentioned above) or use the robots.txt file to block those pages from spiders. I have been doing SEO since 1997, and I use both on my sites.
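    A sketch of the robots.txt alternative mentioned above; the paths are hypothetical and would need to match your site's actual URLs.

    ```
    # robots.txt at the site root — keep crawlers away from the boilerplate pages
    User-agent: *
    Disallow: /privacy-policy/
    Disallow: /terms-of-use/
    Disallow: /disclaimer/
    ```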
    • davidf515
      Thanks UMS, for that informative article. I'm learning more every day.
      Actually the article DOES address this.

      I see that there are essentially two kinds of duplicate content: 1. actual duplicate content, i.e. articles or other info copied from other sites and put on yours, and 2. multiple URLs on the same site all pointing to the same content. It is the first type that I was afraid would apply to my Disclaimer, etc. pages.

      To quote the article:

      There are some penalties that are related to the idea of having the same content as another site--for example, if you're scraping content from other sites and republishing it, or if you republish content without adding any additional value. These tactics are clearly outlined (and discouraged) in our Webmaster Guidelines.

      This is what I was afraid Google might assume having seen the exact same content (Disclaimer, etc.) on several of my sites.

      As for blocking them from robots, I have also read that Google really likes to see these types of "Consumer Info" pages because they make for a more trustworthy site and a better experience for the visitor. And it's usually a good thing to make Google happy.
    • UMS
      Originally Posted by TDogger View Post

      UMS, that is actually a very old Google blog post. Panda pretty much changed a lot of things.
      Panda has changed a lot of things, but the duplicate content myth hasn't changed.

      Google is always going to rank what it considers the original source of the article higher than any duplicate copies, but it would be wrong to say that Google penalises a site because they use duplicate content.

      Just think of all the news websites that use syndicated (which by definition is duplicate) content.
      • Nelapsi
        Originally Posted by UMS View Post

        Panda has changed a lot of things, but the duplicate content myth hasn't changed.

        Google is always going to rank what it considers the original source of the article higher than any duplicate copies, but it would be wrong to say that Google penalises a site because they use duplicate content.

        Just think of all the news websites that use syndicated (which by definition is duplicate) content.
        And you are going to tell me Google does not give special consideration to larger, more established authority sites over the small guy? I say noindex them and not worry about the whole issue or take a chance; it's simple and effective.
  • project123
    You have to take it all in context. If duplicate content got you penalised, we would see all the news sites banned from the net. What you must do to succeed is provide relevant, quality content above all else.
  • Eugeanne
    I agree with Nelapsi; putting a noindex tag on each of those pages will help you avoid duplicate-content issues.
  • seomanchun
    Hire a content writer to write the content for your blog. You could use a paid tool to rewrite content, but those tools change the meaning and introduce grammatical errors. In my experience, hiring a content writer is the best way to promote your blog in the search engines and drive more traffic to it.
    • davidf515
      I guess I'm just seeing a Catch-22 here.

      Say I have 20 sites, each with its own unique content. In addition, each also has the usual "must-have" pages: Disclaimer, Privacy Policy, Terms of Use, About Us, and Contact Us. Well, I'm probably going to want to use the same generic content for these must-have pages across all of my sites.

      So now I have 20 sites, each having, among their unique content, 5 pages of identical content to the other 19.

      On the one hand, Google really likes to see these pages on sites. At this point, they are pretty much mandatory, or so I've been told. So, I certainly don't want to hide them from Google.

      On the other hand, all this duplicate content seems bad to me, so I make them "Do Not Crawl" or "No Index", but that hides them from Google - not good.

      Maybe I need to just go in and "Spin" this dupe content on each site.

      Maybe I need to just hide them from Google.

      Maybe I need to just not worry about it and go earn some money.

      Thanks to all for your input.
  • mare
    I've been trying to find the correct answer on this topic for some time now, and I always get different answers. I still don't know whether to index these pages (About, Contact, Sitemap, legal pages) and, if not, which meta tag I should use for them: "noindex,follow" or "noindex,nofollow"?

    I've noticed that many people are confused about this, so I don't understand why Matt Cutts doesn't give a straight answer to this question. I just assume he hasn't answered it yet; otherwise someone would probably already have posted a link to the answer. And I don't mean only in this forum.

    Marko
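    For reference, the two meta tag variants Marko asks about look like this. "noindex,follow" keeps a page out of the index while still letting crawlers follow its links; "noindex,nofollow" does both. Which one is "right" for these pages is exactly the open question in this thread.

    ```html
    <!-- Keep the page out of the index, but still follow its links -->
    <meta name="robots" content="noindex,follow">

    <!-- Keep the page out of the index and don't follow its links -->
    <meta name="robots" content="noindex,nofollow">
    ```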
  • maskpeterson
    Because duplicate content reflects badly on a site, especially since the Google Panda update.
  • addedsoft
    I think you don't need to worry about these pages.
  • CyborgX
    There's no duplicate content penalty, really, especially for that. I would imagine any site like yours has similar descriptions.

    I've even copied Amazon's descriptions for my pages. That's slightly different from what you are doing, but for similar items, Amazon has no choice but to use the same description across the board, or at least a good deal of it. They sell DVD, DVD/Blu-ray combo, Blu-ray, special package, etc. editions of popular movies, and they all have mostly the same description, except for what's in the bundle.

    I think the top two games on Amazon are both L.A. Noire, just for different systems. The game description is all the same.
  • lauragibbs83
    As far as I know, there are tools online that can check for duplicate content.
