Duplicate content doesn't really matter?

Hello all,

I conducted a little experiment a few weeks ago in response to a question I received about PLR and duplicate content.

Basically, I added the same article to different blogs and waited to see if they would be indexed.

I used a keyword that I made up so I could track the progress.
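A made-up tracking keyword just needs to be a string that appears nowhere else on the web, so every search hit for it must come from the test pages you planted. A minimal sketch in Python (the prefix and token length here are my own arbitrary choices, not George's actual method):

```python
import uuid

def make_tracking_keyword(prefix: str = "plrtest") -> str:
    """Build a nonsense keyword that is vanishingly unlikely to exist
    anywhere else online, so any search result containing it has to be
    one of our own test pages."""
    token = uuid.uuid4().hex[:10]  # 10 random hex characters
    return prefix + token

keyword = make_tracking_keyword()
print(keyword)  # something like "plrtest3f9a1c02bd" (random each run)
```

Searching Google for that exact string later shows only the planted pages, and their relative order reveals how each copy ranks.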

I found that the first site to publish the article reached the top of Google, and the rest were ranked by posting time as well. I also tested a few variations: I changed the wording of one article by about 10% and it ranked well; I changed another by about 20% and it ranked even better. It was interesting that simply changing the wording was enough to improve the ranking.
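George doesn't say how he measured the 10% and 20% rewrites, but a word-level diff gives one reasonable definition of "percent changed". A hypothetical sketch using Python's standard difflib (the sample sentences are my own illustration):

```python
import difflib

def percent_changed(original: str, rewritten: str) -> float:
    """Rough word-level difference between two article versions:
    0.0 means identical, 100.0 means nothing in common."""
    a = original.lower().split()
    b = rewritten.lower().split()
    ratio = difflib.SequenceMatcher(None, a, b).ratio()  # 1.0 = identical
    return round((1.0 - ratio) * 100, 1)

orig = "the quick brown fox jumps over the lazy dog near the river bank"
spun = "the quick brown fox leaps over the lazy dog near the creek bank"
print(percent_changed(orig, spun))  # → 15.4 (two of thirteen words swapped)
```

Measuring rewrites this way would make results like "10% changed ranked well, 20% ranked better" reproducible across tests.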

Every page I created was indexed by Google. None of the pages were placed in the supplemental index or anything like that.

I'm not saying this was a perfect experiment, since there was no competition for the keyword, but it was interesting.

I'm still looking into the results and I'll let you know what I find. Also, I'm thinking of doing a few more experiments to see what happens.

Does anyone else have any ideas about this topic?

Thanks,

George
#search engine optimization #content #duplicate #matter
  • Sounds like a good test George. I've been doing some testing myself with unmodified PLR lately, and have found similar results. All my pages are still indexed even though I know that there are many other sites out there using the same content.

    The one thing I will add is that I believe your ranking is not based just on when the content was posted; it's also based on any backlinking and PageRank you have going. I'm getting a good bit of search engine traffic, mostly from Google, and am ranking highly for some keywords, while the other sites using this same content are nowhere to be found.

    I think this is mostly due to the little bit of backlink work I've done for my site versus those that have done none; even that small effort is producing good results. I'm now outsourcing some of my backlink building and expect to see even better results soon.

    Keep us posted on your results.
  • It's an interesting thing to see really. I never conducted any such experiments myself.

    A general belief is that the duplicate content page won't be indexed. :confused:
    • This issue is definitely important to be tested.

      My current rule of thumb: be unique if possible; if not possible, rely on luck.
  • I just built a blog with articles I published on EzineArticles; about the only thing I did was change the titles a bit. I then submitted the feed to feedagg.com, and now the feedagg pages rank right below my EzineArticles pieces on Google, even though the content is 100% duplicate.
  • I was wondering: does it matter if we use articles from EzineArticles on our blogs? I know we can use them if we keep the author's name and everything, but can we adjust the format to place AdSense ads, or do we have to leave it unchanged?
  • Duplicate content doesn't really matter if it's on a separate IP. Duplicate content on the same domain or IP does matter.
    • I've read from experts, some of them Google-related, that the duplicate content filters and the Google Sandbox are only hoaxes (BS).

      Plus, my common sense tells me it would be very difficult for search engines to rank sites on a duplicate-content criterion, because it isn't logical, in the editorial business, to expect every piece of information to be unique.

      The core concept of the Web is sharing.


      Good information is repeated again and again.

      Good products are sold by many people, and the marketing materials are all identical.

      News is copied and pasted thousands of times a day, even by the most popular and reliable sites on the Net.


      About the IP issue, I'm not sure at all.

      Most sites use shared hosting, and very few purchase a dedicated IP.

      In these circumstances, hundreds of sites share their IPs.


      Moreover, if you have a couple of businesses, you can purchase an individual IP for each one. But if your model is, for example, to build niche AdSense sites (the kind that most depends on organic SEO), you won't purchase an individual IP for each of your 400 sites, will you?

      Plus, I've read many success stories from AdSense guys who don't even purchase a new domain for each site, but put all of them in directories of the same domain.



      Conclusion: I believe there are too many myths in the Internet Marketing arena, many of them meant to scare people into purchasing new products and services.

      But only one thing is actually true: what testing tells us.



      Just my 2 cents ;-)


      Leonardo
    • That's my take on it also...

      btw Bob, your avatar confuses me every time I see it... I'm not sure whether to laugh or shriek...

      - Jared
    • 100% TRUE!

      Also, the first one to publish the content will be ranked 1st for it so always get it indexed on your site before anyone else (EZA).

      Louis
    • Whoever told you that is in a world of their own.

      If you had, say, 50 places to post the same article, sure, the backlinks work OK, but only ONE of those 50 copies is going to show up in the SERPs at any one time; Google may select a different copy the next time the same query is run, and only 16 or so of those duplicate posts will be shown at any given time.

      What would you rather do: the above, or create 50 unique articles and have all of those articles come up in the SERPs for the same phrase?

      Any backlinks coming from the latter will have more power than links coming from heaps of pages that contain duplicate content.

    • Have you ever tested it? I have, using separate blogspot.com and wordpress.com accounts, and nothing happened like the myth says. In fact, a blog that copied all its content from another ranked better than the original, thanks to backlink quantity.
  • Duplicate content is when you have the same article on your OWN site.

    If you read Josh Spaulding's blog, he recently ran an experiment, and the result was positive.

    Franck.
  • I use "a lot" of duplicate content, in fact to the tune of "millions" of pages. In my experience it takes at least 3-6 months for duplicate pages to do their work, but most of the time they do get through in the end.

    The pages may still be in Google's index, but they will only show up for the narrowest of searches, quoted phrases like "10cc OS model helicopter engine with muffler", whereas before they would show up without the quotation marks.

    As to whether the original published page always remains indexed and ranks well: that's what we'd want from Google, but in my experience it is not always the case, especially if the copies are put up only a short time apart.


    Derek
  • I recently did some experiments with EzineArticles pieces posted on my blogs and submitted to RSS feed sites. For a few weeks the blog pages and RSS feeds ranked well, especially on feedagg, but as time went on they disappeared.

    The time it takes to alter an article so it will stick around for years is worth it to me. The easiest way is rarely the best way.
  • It may work now, but the bigger question is: will it last over time? Google and the others get smarter every day!
  • Exactly right, Captivereef. I've gotten the exact same article (title and content) in my Google Alerts on the same day, both copies indexed. Over time, though, Google seems to be able to filter out the duplicated one. A lot depends on which one is indexed first, but also on which one you build backlinks to.
