11 replies
  • SEO
Suppose there are two websites, A and B, and A has copied more than 30% of its content from B. Is it possible that site A gets banned by Google, or is there a percentage that Google allows, such as less than 30% or 20%? Also, which is the best tool to check for content duplication?
#content #duplicate
  • Profile picture of the author rasel786
    I think copyscape.com will help you check for content duplication.
    • Profile picture of the author C Rebecca
      As such, there is no set percentage or ratio for declaring content duplicate. If the content looks appreciably similar to the bot, it will be declared duplicate, and either of the two websites may be the one flagged. The one that was indexed earlier may get some benefit as the original content, but many other factors are considered before a website is declared a duplicate.
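      To give a rough picture of what "appreciably similar" can mean, here is a toy sketch in Python (purely illustrative, nothing like Google's actual method) that compares two pages by word shingles:

      # Toy near-duplicate check using word-shingle Jaccard similarity.
      # Purely illustrative; real search engines use many more signals.

      def shingles(text, size=5):
          words = text.lower().split()
          return {tuple(words[i:i + size]) for i in range(len(words) - size + 1)}

      def similarity(text_a, text_b, size=5):
          a, b = shingles(text_a, size), shingles(text_b, size)
          if not a or not b:
              return 0.0
          return len(a & b) / len(a | b)  # Jaccard index, 0.0 to 1.0

      # A score near 1.0 means the two pages are mostly the same text.
      # print(similarity(open("page_a.txt").read(), open("page_b.txt").read()))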
      Signature

      FREE 30 minutes of Ecommerce Marketing consultation. Consulted clients like Overstock.com, About.com, Lowe's and more...
      Book at: hello@techzui.com

      • Profile picture of the author samual james
        Originally Posted by C Rebecca View Post

        As such, there is no set percentage or ratio for declaring content duplicate. If the content looks appreciably similar to the bot, it will be declared duplicate, and either of the two websites may be the one flagged. The one that was indexed earlier may get some benefit as the original content, but many other factors are considered before a website is declared a duplicate.
        Definitely, there is no percentage that says you can copy 20% or 30% of the content. Try to generate unique content for your site if you want better SERPs and traffic.
        • Profile picture of the author Guru SEO
          If I were Google, the algo would go like this:

          Content indexed first: +10
          Normal Google algo: +5

          If you scrape content that was already indexed, it would require twice as much SEO juice as usual to rank your page.

          You can test this by posting an article at a directory, letting it get indexed, and then seeing how much juice you need to overcome it.

          That is why you should usually post on your own site first and then submit to the article directories.
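          As a very rough sketch of that idea (the numbers are made up for illustration and are not Google's real weights), it could be expressed in Python like this:

          # Toy ranking sketch of the "first indexed gets a bonus" idea above.
          # The +10 / +5 weights are invented purely for illustration.

          def toy_score(seo_strength, indexed_first):
              first_index_bonus = 10 if indexed_first else 0
              base_algo_score = 5 * seo_strength  # stand-in for the "normal Google algo"
              return first_index_bonus + base_algo_score

          # The scraper needs more than twice the SEO juice to catch up:
          original = toy_score(seo_strength=1, indexed_first=True)   # 15
          scraper = toy_score(seo_strength=2, indexed_first=False)   # 10 -- still behind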
  • Profile picture of the author StanTman
    Yes, Copyscape is the best, but you have to buy credits at $0.05 per search.
  • Profile picture of the author janita
    I've been using Copyscape for a long time and it's pretty great; try it!
    Signature

    Get IT Certifications With the help of http://www.certshelp.com/ Study Material to boost your career.

  • Profile picture of the author Ant B
    The same content on different domains does not receive a duplicate content penalty. However, the better, older, SEO-optimised site will be preferred by the search engines and will rank higher than the second site with the same content. Note that this is not a penalty; it is a decision made on criteria where one site is better than the other in some way (backlinks, bounce rate, domain age, etc.).
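    Conceptually (and this is only a guess at the kind of logic involved, not the actual algorithm), that choice between duplicate copies can be pictured as a weighted comparison:

    # Hypothetical tie-breaker between two copies of the same content.
    # The signals and weights below are assumptions, chosen only to illustrate the idea.

    WEIGHTS = {"backlinks": 0.5, "domain_age_years": 0.3, "bounce_rate": -0.2}

    def preference_score(site):
        return sum(WEIGHTS[signal] * site.get(signal, 0) for signal in WEIGHTS)

    site_a = {"backlinks": 120, "domain_age_years": 1, "bounce_rate": 70}
    site_b = {"backlinks": 400, "domain_age_years": 6, "bounce_rate": 40}

    # The copy with the higher score is shown; the other is filtered, not "banned".
    winner = max((site_a, site_b), key=preference_score)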
    • Profile picture of the author RyanLB
      Originally Posted by Ant B View Post

      The same content on different domains does not receive a duplicate content penalty. However, the better, older, SEO-optimised site will be preferred by the search engines and will rank higher than the second site with the same content. Note that this is not a penalty; it is a decision made on criteria where one site is better than the other in some way (backlinks, bounce rate, domain age, etc.).
      This is true to an extent in my experience, but Google doesn't always get it right. They have gotten a lot better at identifying the owner of the content, though. It has been a while since I saw someone outright scrape my content and republish it, so I can't give any recent notes on what I've seen. It used to be a pretty common occurrence.
      Signature

      I'm a Freelance Copywriter that helps Agencies, Startups and Businesses Educate Their Audience and Grow Sales
      Skype Me: r.boze
  • Profile picture of the author lindasalesglobol
    CopyScape and Plagium are the best of the lot and work great!
  • Profile picture of the author ghazia
    In that case, Google considers domain age and the cached copy of the content to see which version was published earlier.
    • Profile picture of the author iliana
      I don't think Google puts that much weight on percentages.

      As Matt Cutts (Google's head of webspam) has said, Google cares about relevance and the better content, so in the case of duplicated content Google will rank the site with the better overall content higher.

      But don't forget that there is an upcoming Google update that will change all of this, because Googlebot will get better at recognising unique content and will decrease the rankings of sites with duplicated content.
