Duplicate Content in a Post-Panda World

I know there are frequent discussions on duplicate content and syndicated content, especially post-Panda.

Below is an article that you may find useful.

Duplicate Content in a Post-Panda World | SEOmoz
  • lotsofsnow
    Thank you for the article.

    Synopsis:

    No duplicate content and you are good.
    Signature

    Call Center Fuel - High Volume Data
    Delivering the highest quality leads in virtually all consumer verticals.

    • yourreviewer
      Originally Posted by hpgoodboy View Post

      Thank you for the article.

      Synopsis:

      No duplicate content and you are good.
      I don't necessarily believe that "no duplicate content and you are good," because you can still have unique content that is crappy.

      From a personal standpoint, I always look for relevant high quality content.

      And if that high-quality content happens to be an ezine article, I will publish it; if it is an article written by an expert, I will contact them and seek their permission to reprint it; and if it is something that I can write or outsource, even better.

      But my focus is on relevance and high quality rather than on unique and original.

      Here is a video that talks about how having unique content may still not suffice.

      How Google's Panda Update Changed SEO Best Practices Forever - Whiteboard Friday | SEOmoz
  • tpw
    Originally Posted by tpw View Post

    Google does not live or die based on "unique content". They live or die based on the "quality of their results" as perceived by the majority of their users.

    If "unique" was the leading factor in their search results, spam blogs full of spun gibberish would win the day. But it doesn't, does it?

    Build and promote your sites for your users, with little regard for Google. Build and promote your sites in a way that they will continue to be successful, even if Google disappeared tomorrow. Do this, and you may be surprised when you find that Google loves your sites anyway.

    Rather than type the same thing twice, quoting works just fine.
    Signature
    Bill Platt, Oklahoma USA, PlattPublishing.com
    Publish Coloring Books for Profit (WSOTD 7-30-2015)
    • myob
      Originally Posted by tpw View Post

      Rather than type the same thing twice, quoting works just fine.
      Works just fine for me too:

      Originally Posted by myob View Post

      It seldom is an issue anyway, but my marketing model is to develop syndicated outlets in a wide spectrum of niches. Writing specifically for targeted readers is resource-intensive, and efficiencies of scale are maximized in leveraging articles through syndication. With currently over 28,000 publishers in my network (and growing rapidly), writing unique articles is not cost-effective.
  • Kurt
    Originally Posted by tpw View Post

    Google does not live or die based on "unique content". They live or die based on the "quality of their results" as perceived by the majority of their users.

    If "unique" was the leading factor in their search results, spam blogs full of spun gibberish would win the day. But it doesn't, does it?

    Build and promote your sites for your users, with little regard for Google. Build and promote your sites in a way that they will continue to be successful, even if Google disappeared tomorrow. Do this, and you may be surprised when you find that Google loves your sites anyway.



    A couple of flaws in this reasoning... First is defining "quality". IMO, duplicate results are NOT quality results, and I'd bet quite a bit that if Google's Top 10 were filled with the same info, Google's popularity would drop.

    Then this comment:
    If "unique" was the leading factor in their search results, spam blogs full of spun gibberish would win the day. But it doesn't, does it?
    This assumes that only "spun gibberish" competes against other "spun gibberish". This isn't accurate. Spun pages must compete with all pages.


    Next, not all spun pages are "gibberish". Well-spun pages read like any other page.

    And...Spun pages use different keywords and phrases, so well-spun pages probably aren't competing directly against each other for the same keywords.
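    As a concrete illustration of what "spinning" means mechanically, here is a minimal sketch of expanding spintax, the `{option1|option2}` syntax most spinning tools use. The sample template and the `spin` function name are hypothetical, not taken from any particular tool.

```python
import random
import re

def spin(text, rng=random):
    """Replace each {a|b|c} group with one randomly chosen option.

    Handles nested groups by repeatedly resolving the innermost
    braces first (an innermost group contains no further braces).
    """
    pattern = re.compile(r"\{([^{}]*)\}")
    while True:
        match = pattern.search(text)
        if match is None:
            return text
        choice = rng.choice(match.group(1).split("|"))
        text = text[:match.start()] + choice + text[match.end():]

# Hypothetical spintax template:
template = "{Well-spun|Carefully spun} pages {read like|look like} any other page."
print(spin(template))
```

    Each call picks a different variant, which is why two outputs of the same template end up targeting slightly different phrases.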
    Signature
    Discover the fastest and easiest ways to create your own valuable products.
    Tons of FREE Public Domain content you can use to make your own content, PLR, digital and POD products.
    • tpw
      Originally Posted by Kurt View Post

      Next, not all spun pages are "gibberish". Well-spun pages read like any other page.

      You know as well as I do that the quality of the output is defined by the user of the tool and not by the tool itself.

      A spinner in and of itself is not a bad tool.

      The persons who use it generate good content IF they invest the effort into the process.

      Yet, most people don't. Most people who use spinners are lazy, and their content looks like a 3-yo translated the content from Klingon. :p

      Those who do not use the tool well crank out pure gibberish on a regular basis.

      My critiques represent those who abuse the software to flood the Internet with more trash than New York City creates in a day.

      In case you missed it, when I call spun content "gibberish", I am not talking about the kind of content that you create when you use a spinner.


      Originally Posted by Kurt View Post

      duplicate results are NOT quality results. And I'd bet quite a bit that if Google's Top 10 were filled with the same info, Google's popularity would drop.

      It is true that duplicate results in Google's search results would not represent quality results. Google's users would agree, and so would I.

      Google does a pretty good job filtering the duplicate results out of its search results pages (SERPs), and that is a good thing.
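      For anyone curious how that kind of filtering can work in principle, the textbook approach is shingling: break each page into overlapping word n-grams and measure the overlap of the two sets with Jaccard similarity. This is only an illustrative sketch, not Google's actual algorithm.

```python
def shingles(text, n=3):
    """Return the set of overlapping n-word shingles in `text`."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a, b):
    """Jaccard similarity of two shingle sets: |A & B| / |A | B|."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

doc1 = "the quick brown fox jumps over the lazy dog"
doc2 = "the quick brown fox leaps over the lazy dog"
sim = jaccard(shingles(doc1), shingles(doc2))
print(round(sim, 2))  # → 0.4
```

      Two pages with a similarity near 1.0 are near-duplicates and can be collapsed into one result; syndicated copies of the same article would score very high by this measure, which is why only one copy tends to show.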

      But the fact that an article may appear in more than one location is not a problem for most of the world. If they have read it before, they will simply skip it and go to content that they have not read before.

      The duplicate content argument is only an argument aimed at determining what is best for Google and the other search engines. And frankly, their businesses are not my business.

      If an Internet-user finds the same article on more than one site, that will only create another page for them to skip.

      But syndicated content is about reaching new audiences who missed the content when it was published in another location.
