Duplicate vs Syndicated Content

by sjohn
59 replies
  • SEO
Could someone please explain how these two are different?
#content #duplicate #syndicated
  • Profile picture of the author WAL08
    I don't believe it's different. Syndicating articles essentially means taking articles that someone has already created and redistributing them.

    Think of re-runs of your favorite TV shows. You can see old re-runs on various networks (distribution channels), but they are the same shows as the original.
    • Profile picture of the author JohnMcCabe
      Duplicate content is copies of content on a single site under different file names. For example, most blog platforms are programmed to create duplicate content because the same post will appear in multiple sub-directories (category, date, tags, etc.). Eliminating the effects of all that duplicate content is one of the objectives of the various SEO plugins.

      Syndicated content is more like what is described above. Content, be it article, video or whatever is used on multiple sites with the express permission of the rights holder.
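      To make the blog-platform case concrete: the fix SEO plugins apply is usually a canonical tag in each duplicate view of a post. A minimal sketch (the URL is a hypothetical example, not from this thread):

```html
<!-- Placed in the <head> of every duplicate view of a post
     (category page, date archive, tag page, etc.).
     The href points at the one copy you want treated as "real". -->
<link rel="canonical" href="http://www.example.com/blog/my-original-post/" />
```

      With this in place, the category, date, and tag copies all point search engines back at a single URL instead of competing with it.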
      • Profile picture of the author Alexa Smith
        Banned
        Originally Posted by JohnMcCabe View Post

        Duplicate content is copies of content on a single site under different file names.
        ^^^^^ This. Exactly.

        The two are completely different.

        In terms of their location, in SEO terms, in terms of their significance, and all that other stuff.

        If it's on different sites, then it isn't "duplicate content" as far as Google is concerned. So don't think of it as "duplicate content" in your own mind, either, because for SEO purposes, and for generally "understanding what you're doing in internet marketing" too, having a conceptual/definition framework that's so out of line with Google's probably isn't really the best starting position.

        Of course, there's nothing to stop anyone from being like Humpty Dumpty in Through The Looking Glass, if they want to be like that (“When I use a word,” Humpty Dumpty said, in a rather a scornful tone, “it means just what I choose it to mean—neither more nor less”) but if you allow yourself to imagine that what's really "syndicated content" is "duplicate content" (just because you "choose to define it that way"), then it's going to draw you into all sorts of confusions, problems and misunderstandings.

        And before you know where you are you'll start believing that there's a "duplicate content penalty" for things that aren't even "duplicate content" in the first place, and all sorts of nonsense like this.
        • Profile picture of the author DireStraits
          Originally Posted by Alexa Smith View Post

          And before you know where you are you'll start believing that there's a "duplicate content penalty" for things that aren't even "duplicate content" in the first place, and all sorts of nonsense like this.
          Good heavens -- I've never seen proposed such far-fetched, ludicrous and inconceivable nonsense!



          Oh, wait ...
      • Profile picture of the author Dave Rodman
        Banned
        Originally Posted by JohnMcCabe View Post

        Duplicate content is copies of content on a single site under different file names. For example, most blog platforms are programmed to create duplicate content because the same post will appear in multiple sub-directories (category, date, tags, etc.). Eliminating the effects of all that duplicate content is one of the objectives of the various SEO plugins.

        Syndicated content is more like what is described above. Content, be it article, video or whatever is used on multiple sites with the express permission of the rights holder.
        I'm not saying that a syndicated article will get you in trouble in the SEs, but there is the possibility of getting it filtered out. If someone picks up an article from a directory, it makes sense to contact that site owner and offer a unique and expanded version of the original article. If you can avoid having duplicate content, then you should. And no need to give me YOUR OWN definition of syndicated and duplicate content; Google has their own.....

        http://www.google.com/support/webmas...y?answer=66359

        Duplicate content generally refers to substantive blocks of content within or across domains that either completely match other content or are appreciably similar.
        And a tip...

        Syndicate carefully: If you syndicate your content on other sites, Google will always show the version we think is most appropriate for users in each given search, which may or may not be the version you'd prefer. However, it is helpful to ensure that each site on which your content is syndicated includes a link back to your original article. You can also ask those who use your syndicated material to use the noindex meta tag to prevent search engines from indexing their version of the content.
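        In practice, Google's two suggestions above would look something like this on a syndication partner's copy of your article (URLs are hypothetical placeholders):

```html
<!-- On the partner's copy of the article: keep it out of the index -->
<meta name="robots" content="noindex">

<!-- ...and somewhere in the article body or footer, credit the original -->
<p>This article originally appeared at
   <a href="http://www.example.com/original-article/">example.com</a>.</p>
```

        The link back helps Google pick the original as the version to show; the noindex tag takes the partner's copy out of the running entirely.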
        And from Matt Cutts...

        Duplicate content question

        However, I would be mindful that taking all your articles and submitting them for syndication all over the place can make it more difficult to determine how much the site wrote its own content vs. just used syndicated content. My advice would be 1) to avoid over-syndicating the articles that you write, and 2) if you do syndicate content, make sure that you include a link to the original content. That will help ensure that the original content has more PageRank, which will aid in picking the best documents in our index.
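        As a side note, the "appreciably similar" wording in Google's definition quoted above can be illustrated with a toy similarity check. This is only a sketch of the general idea (word shingles compared with Jaccard overlap), not Google's actual algorithm, and the sample texts are made up:

```python
# Toy illustration of near-duplicate detection: split each text into
# overlapping word 3-shingles, then measure Jaccard overlap between
# the two shingle sets. Identical texts score 1.0; unrelated texts 0.0.

def shingles(text, k=3):
    """Return the set of k-word shingles of a text."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard similarity of two sets (1.0 when both are empty)."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

original   = "the quick brown fox jumps over the lazy dog near the river"
syndicated = "the quick brown fox jumps over the lazy dog near the river"
rewritten  = "a speedy brown fox leaps over a sleepy dog by the water"

print(jaccard(shingles(original), shingles(syndicated)))  # 1.0 - exact duplicate
print(jaccard(shingles(original), shingles(rewritten)))   # 0.0 - no shared shingles
```

        A syndicated copy scores as an exact duplicate, which is why one version gets picked for the SERPs; a genuine rewrite scores near zero.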
        • Profile picture of the author Alexa Smith
          Banned
          Originally Posted by Dave Rodman View Post

          I'm not saying that a syndicated article will get you in trouble in the SE's, but there is the possibility of getting it filtered out.
          Two of the many points you've missed, Dave (and if you'll excuse the observation, they're points you also reliably miss over and over again, every single time you join in a discussion of this subject, however many highly experienced article marketers take pains to point them out to you) are that when an article gets what you call "filtered out" (and what other people call "indexed in the supplemental index rather than in the main index"):-

          (i) the backlinks in its resource-box count in exactly the same way and carry exactly the same weight as if it were included in the main index;

          (ii) the effect on the site(s) linked to in the resource-box is, of course, a beneficial effect, not a detrimental effect - and no amount of using the words "penalised" and "filtered out" can change that reality.
          • Profile picture of the author Dave Rodman
            Banned
            Originally Posted by Alexa Smith View Post

            Two of the many points you've missed, Dave (and if you'll excuse the observation, they're points you also reliably miss over and over again, every single time you join in a discussion of this subject, however many highly experienced article marketers take pains to point it out to you) are that when an article gets what you call "filtered out" (and what other people call "indexed in the supplemental index rather than in the main index"):-

            (i) the backlinks in its resource-box count in exactly the same way and carry exactly the same weight as if it were included in the main index;

            (ii) the effect on the site(s) linked to in the resource-box is, of course, a beneficial effect, not a detrimental effect - and no amount of using the words "penalised" and "filtered out" can change that reality.
            1a) I disagree that, when the S.I. results were visible, a link from there carried the same weight as one from the Main Index. I conducted numerous tests on my own to verify this, as well as tests with other members of Stompernet.

            1b) But given that it's not visible anymore, it's irrelevant. But I have a question for you since you state, with confidence, that the links count the same. Since you don't know which pages are in the Main Index vs. The Supplemental Index, how do you KNOW they count the same? You have no idea which is the Main index and which is the S.I.

            2) I never said anything about a penalty. And I'm not using penalty and filter interchangeably. Filter is meant to describe Google's actions of removing duplicate articles from the index in favor of one page. They might count the link, but your duplicate pages are essentially out of the game when it comes to the SERPS.

            That's why if you have a relationship with the site owner, you should be giving them expanded or slightly different copies of the article.
            • Profile picture of the author Alexa Smith
              Banned
              Originally Posted by Dave Rodman View Post

              given that it's not visible anymore, it's irrelevant.
              So for backlinks to be relevant to you, they have to be "visible" to you, now, without looking in the supplemental index?! Visible to Google's no longer good enough?!

              "Interesting" new rule of SEO, there!

              Originally Posted by Dave Rodman View Post

              you don't know which pages are in the Main Index vs. The Supplemental Index
              That's completely wrong: it's actually very easy to find out which, if you want to. Think for a minute about what you're saying, Dave!
              • Profile picture of the author Dave Rodman
                Banned
                Originally Posted by Alexa Smith View Post

                So for backlinks to be relevant to you, they have to be "visible" to you, now, without looking in the supplemental index?! Visible to Google's no longer good enough?!

                "Interesting" new rule of SEO, there!
                You don't know

                a) If there even IS still a S.I.
                b) If there is still the S.I., you have no way of knowing if a page is there. Pretty much every major SEO confirms it's impossible to know, yet you claim to KNOW (not suspect) that links from S.I. pages and Main Index pages count the same.

                So again, if you don't KNOW if a page is in the S.I. or not (which you don't), then how can you claim that a link from a Main Index page and S.I. page count the same?

                I fully expect you to dodge the question.
                • Profile picture of the author Richard Van
                  I fully expect you to dodge the question.
                  Well, I fully expect the exact opposite and as it goes, you should too.
                  Signature

                  Wibble, bark, my old man's a mushroom etc...

                • Profile picture of the author DireStraits
                  Originally Posted by Dave Rodman View Post

                  You don't know

                  a) If there even IS still a S.I.
                  b) If there is still the S.I., you have no way of knowing if a page is there. Pretty much every major SEO confirms it's impossible to know, yet you claim to KNOW (not suspect) that links from S.I. pages and Main Index pages count the same.

                  So again, if you don't KNOW if a page is in the S.I. or not (which you don't), then how can you claim that a link from a Main Index page and S.I. page count the same?

                  I fully expect you to dodge the question.
                  "Every major SEO"?

                  Yes, because major SEO'ers never talk major bollocks, do they, Dave?

                  They never lie to their customers about the complexity of what it is they do, as a means of justifying their often extortionate pricing. Or about their methods, for that matter.

                  And others in the SEO industry--those who sell SEO-related software tools/services to other SEO'ers--never ride off the backs of widely perpetuated, baseless, mythical "facts"/beliefs in order to sell their products/services or even justify the need for their existence (think: article spinning software, etc) ... right?

                  That's simply unheard of. :confused: :rolleyes:

                  And they surely can't be the same SEO experts who post on forums, such as this one, about the "dangers of building backlinks too quickly" and why you'll be "penalised for syndicating your content" and so on ... ?

                  Nor can they be the "armchair-SEO'ers" who spend all their time theorising, speculating and putting out nice-looking graphs and tables on their blogs, but who haven't actually gotten round to doing any real-world SEO for the last 10 years.

                  I'll grant you one thing: Google doesn't, from what I can see, employ the old "Supplemental Results" label in their SERPs. Mostly, I think, because it conferred no real meaning to the average searcher. As far as they were concerned, they'd performed a search and some results came back. Either they could find something among them that was relevant to their search and answered their query, or they couldn't. Anything else, such as "which index the result came from", was a sheer irrelevance.

                  But that doesn't mean it doesn't exist.

                  Anyway, I think what Alexa's referring to when she mentions the supplemental index are the results that don't display unless you click the link at the bottom of the SERPs telling you that some similar results have been omitted.
                  And before you say "those aren't supplemental results", let me ask you: what exactly are they, then? Because, to me, the words "in order to show you the most relevant results, we have omitted..." imply that the results they've omitted weren't deemed relevant enough (due to being similar/identical to other results that were already shown). And that was exactly one of the features of a "supplemental result", wasn't it? Essentially, they were "results normally excluded from the SERPs due to lack of relevance/importance/uniqueness, but displayed on this occasion for <whatever reason>".

                  Seems like the same thing to me. :p
      • Profile picture of the author laurie390
        Originally Posted by JohnMcCabe View Post

        Duplicate content is copies of content on a single site under different file names. For example, most blog platforms are programmed to create duplicate content because the same post will appear in multiple sub-directories (category, date, tags, etc.). Eliminating the effects of all that duplicate content is one of the objectives of the various SEO plugins.

        Syndicated content is more like what is described above. Content, be it article, video or whatever is used on multiple sites with the express permission of the rights holder.
        Ok, from this definition, then do I understand this right? If you submit the same article to different article directories, it is syndicated content? And Google is ok with that? Please correct me if I have it wrong. Thanks:-)
        Signature

        Laurie Neumann
        Huge PLR Closeout Sale at Quality Internet Marketing PLR

        • Profile picture of the author JohnMcCabe
          Originally Posted by laurie390 View Post

          Ok, from this definition, then do I understand this right? If you submit the same article to different article directories, it is syndicated content? And Google is ok with that? Please correct me if I have it wrong. Thanks:-)
          Although your interpretation is a little narrow, you have the right idea. That same article appearing on someone else's blog, in the newsletter they send to their list, even included (intact) in a report or ebook or physical product would also be syndicated content.

          From what I've been able to tell, and from what others tell me, Google is perfectly okay with that. The only possible negative I see is the lack of control over which copy of the content Google will display for any particular search. The other side of the coin is that your content has that many more chances of having human eyeballs on it.
      • Profile picture of the author mkmossop
        Originally Posted by JohnMcCabe View Post


        From what I've been able to tell, and from what others tell me, Google is perfectly okay with that. The only possible negative I see is the lack of control over which copy of the content Google will display for any particular search. The other side of the coin is that your content has that many more chances of having human eyeballs on it.
        So in your opinion (and Alexa's it seems), in terms of getting backlinks, I'll get the same credit for submitting an unspun article to 200 directories as I would from submitting a highly spun article to the same directories (assuming I get the same percentage of each submissions indexed)?
        • Profile picture of the author DireStraits
          Originally Posted by mkmossop View Post

          So is that to say (in your opinion) there is no "duplicate content" rule for distributing the same article to hundreds of different sites and that article spinning is useless?

          I've seen people claim that their results are just as good without spinning, but you never know what to believe.
          Well, there is a filter, but no penalty.

          In other words, if you submit the same unspun/non-rewritten/non-unique article to lots of sites, not all of them will be able to individually rank in the SERPs because Google will filter out those it has deemed to be "duplicates", just in the interest of providing diversity to their users.

          But as far as backlink strength and the ability of your non-unique articles to send referral traffic are concerned, you certainly do not need to be spinning or rewriting them, because doing so won't confer any advantage on you (and it may very well be a disadvantage if the quality of your articles is subpar as a result!).
          • Profile picture of the author mkmossop
            Originally Posted by DireStraits View Post

            Well, there is a filter, but no penalty.

            In other words, if you submit the same unspun/non-rewritten/non-unique article to lots of sites, not all of them will be able to individually rank in the SERPs because Google will filter out those it has deemed to be "duplicates", just in the interest of providing diversity to their users.
            So then this does seem to make a difference. If my duplicate articles aren't getting indexed, then I'm not getting credit for them. Or will Google still index these duplicates, and just simply not rank them? I'm talking purely from a backlinking perspective, not whether the articles show up in Google or not.
            • Profile picture of the author DireStraits
              Originally Posted by mkmossop View Post

              So then this does seem to make a difference. If my duplicate articles aren't getting indexed, then I'm not getting credit for them. Or will Google still index these duplicates, and just simply not rank them? I'm talking purely from a backlinking perspective, not whether the articles show up in Google or not.
              Sorry for the confusion!

              Yes, they will still get indexed and your backlinks will still count just the same; you don't need to spin or rewrite your articles for that, and there'd be no advantages to be had from doing so, in this case.
              • Profile picture of the author JohnMcCabe
                Originally Posted by mkmossop View Post

                So in your opinion (and Alexa's it seems), in terms of getting backlinks, I'll get the same credit for submitting an unspun article to 200 directories as I would from submitting a highly spun article to the same directories (assuming I get the same percentage of each submissions indexed)?
                Yes, you will...
                • Profile picture of the author mkmossop
                  Originally Posted by JohnMcCabe View Post

                  Yes, you will...
                  Originally Posted by DireStraits View Post

                  Sorry for the confusion!

                  Yes, they will still get indexed and your backlinks will still count just the same; you don't need to spin or rewrite your articles for that, and there'd be no advantages to be had from doing so, in this case.
                  Thanks for the replies. Guess I'll have to test this out cause I HATE spinning.

                  Also, say I were to build a few of my own web2.0 sites, I can use this same content again and still get solid backlinks from them?
  • Profile picture of the author BlondieWrites
    I'm not sure there is a difference, other than the names. Both use content that's been published elsewhere.

    That said, I would say the two are the same.

    With PLR content, you can rewrite it (and should), but if the content being used is syndicated or comes from another content service, then you can't rewrite it or change it.


    Cindy
    Signature
    WAHM Daily -

    Working from home, work at home moms, make money online, internet marketing, PLR content
    • Profile picture of the author Unity96387
      Originally Posted by BlondieWrites View Post

      With PLR content, you can rewrite it (and should), but if the content being used is syndicated or used from another content type service, then you can't rewrite it or change it.
      PLR is syndicated content, right? If you have 50 sites that all carry the exact same PLR content, how does Google treat those sites if nothing is changed?

      But if someone modifies the keywords in each page then how does that change things for that site?
      • Profile picture of the author DireStraits
        Originally Posted by Unity96387 View Post

        PLR is syndicated content, right? If you have 50 exact PLR sites which all have exact syndicated content on them then how does Google treat those sites if nothing is changed?
        It'll treat them the same way it usually treats multiple instances of the same (syndicated) content: they'll be filtered out into the supplemental index so as not to allow too many (if any) identical results in the SERPs for any single search-query.

        Originally Posted by Unity96387 View Post

        But if someone modifies the keywords in each page then how does that change things for that site?
        Changing the keywords in each article should, in theory, make them relevant for different search-queries, and may allow them to rank in the main index. But whether that'll work or not remains to be seen and can only be answered by testing.

        Either way, the chances are good that you'd still need to build backlinks to those pages to have them rank half-decently, anyway. If you're going to put forth that amount of effort to rank a PLR article, you may as well take a little extra time to rewrite them, thus making them unique and significantly increasing their potential to rank in the main index for other long-tail search-queries, too - ones they otherwise might not have ranked for due to being non-unique, and therefore having to compete head-on for exposure with almost-identical articles across other sites with more SEO authority - no?
        • Profile picture of the author JohnMcCabe
          Originally Posted by Unity96387 View Post

          PLR is syndicated content, right? If you have 50 exact PLR sites which all have exact syndicated content on them then how does Google treat those sites if nothing is changed?

          But if someone modifies the keywords in each page then how does that change things for that site?
          First off, I don't agree that PLR is the same thing as syndicated content. While the license with most PLR packages does allow for publishing unchanged, most vendors will tell you upfront that doing so is not the smartest move you can make.

          Some PLR material may end up as syndicated content, but not all syndicated content is PLR. Compare the license that comes with most PLR packages with the terms of use at a directory site like EZA. The syndication sites (like article directories) and any private syndicate I've been a part of are almost polar opposites of PLR in the rights given.

          Definitions aside, finding 50 identical websites without some kind of deliberate intent is about as likely as finding identical twins with different parents. Google should treat such an occurrence as what it is, index one site and sandbox the rest.

          [Side track: It is scientifically possible to have genetically identical twins with no overlapping parents. A gold star for anyone who can explain how...]
          • Profile picture of the author heavyjay
            Originally Posted by JohnMcCabe View Post


            [Side track: It is scientifically possible to have genetically identical twins with no overlapping parents. A gold star for anyone who can explain how...]
            Double first cousins?
            Signature
            My New Blog - isn't much on it and your critique is more than welcome!
            • Profile picture of the author JohnMcCabe
              Originally Posted by heavyjay View Post

              Double first cousins?
              Two sets of identical twins marry and reproduce. Since both mothers and both fathers are genetically identical by definition, their offspring will also be genetically identical with no overlapping parentage.
              • Profile picture of the author heavyjay
                Originally Posted by JohnMcCabe View Post

                Two sets of identical twins marry and reproduce. Since both mothers and both fathers are genetically identical by definition, their offspring will also be genetically identical with no overlapping parentage.
                OK, I was close. Regular double first cousins would have the same DNA as siblings, not identical twins.

                Dolly Parton's great great grandfather and mine were double first cousins.

                That, and about $14, will get me a "Decaf, Double, Venti, 5-pump White Mocha, 3-pump Peppermint, half soy, half non-fat, no foam, one hundred and eighty, light chocolate whip, caramel drizzle inside and out, with cinnamon and chocolate powder on top, white mocha" at Starbucks.
                Signature
                My New Blog - isn't much on it and your critique is more than welcome!
  • Profile picture of the author wanna-succeed
    Wait a sec, so if I have one of my articles, that I wrote myself, published on the main page of my site & in a special category, I have duplicate content??
    All the posts in a blog go to the main blog page. After they reach the bottom, they are in a category as well, but you can find them in both places. Am I doing something wrong here?
    Signature

    No sig, good day m8...

    • Profile picture of the author Alexa Smith
      Banned
      Originally Posted by tat1973 View Post

      syndicated are articles that can be re-distributed
      Not quite right, I'm afraid.

      Articles which have been syndicated (not "that can be syndicated") to other sites are examples (not a definition) of syndicated content.

      Originally Posted by tat1973 View Post

      and duplicate is pretty much the opposite?
      No; I'm afraid this is also wrong. "Duplicate content" is correctly defined above, in John McCabe's post (#4).

      Originally Posted by wanna-succeed View Post

      Wait a sec, so if I have one of my articles, that I wrote myself, published on the main page of my site & in a special category, I have duplicate content??
      All the posts in a blog go to the main blog. After they reach the bottom, they are in a category as well, but you can find them in both places. AM I doing something wrong here?
      I think you need to explain just a tiny bit more clearly before anyone can answer this confidently.

      Being "able to find them twice" doesn't make it "duplicate content". That makes it sound like one is a link to the other?

      I have loads of articles which are listed under various different categories and can be called up by readers in various different ways by clicking on various category-links, but the text of the articles isn't duplicated in different files. It would be "duplicate content" only if the same text appears within two separate files in the same site. (And even then it doesn't mean that there'd be a "penalty" for it anyway: that's a whole different matter again.)
      • Profile picture of the author JohnMcCabe
        Originally Posted by wanna-succeed View Post

        Wait a sec, so if I have one of my articles, that I wrote myself, published on the main page of my site & in a special category, I have duplicate content??
        All the posts in a blog go to the main blog. After they reach the bottom, they are in a category as well, but you can find them in both places. AM I doing something wrong here?
        In general, the same file found via different urls would be, technically, duplicate content. For example:

        Code:
        example.com
        www.example.com
        both lead to the same page, the home page of the site. But since they have different addresses, they could be considered duplicate content.

        Google has a way, via webmaster tools, to tell them which is the "real" home page and which they can safely ignore. That's why people will tell you to pick one linking scheme (with or without the www) and stick to it.

        Blogs may be a special case. You can use special codes to tell Google which copy you consider the most important one. You can also use either an SEO plugin or robots.txt file to tell the spider which addresses to look at and which ones to ignore.

        That said, I believe they're familiar enough with how blog platforms work that they won't penalize you for something the software does automatically. They simply might not choose the same url as you would to consider as the 'real' one.
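        For reference, the robots.txt approach mentioned above looks roughly like this for a typical blog layout; the paths are illustrative assumptions (a WordPress-style structure), not a prescription:

```
# Hypothetical robots.txt for a blog: keep spiders out of the archive
# views (tag, category, date) that duplicate each post's main URL.
User-agent: *
Disallow: /tag/
Disallow: /category/
Disallow: /2011/
```

        One trade-off worth knowing: pages blocked this way can't pass along their own signals, which is one reason the canonical-tag route is often preferred over blocking.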
        • Profile picture of the author sjohn
          Thanks for all the responses to clear this up.
        • Profile picture of the author Marvin Johnston
          Originally Posted by JohnMcCabe View Post

          Google has a way, via webmaster tools, to tell them which is the "real" home page and which they can safely ignore. That's why people will tell you to pick one linking scheme (with or without the www) and stick to it.
          According to one recognized expert (Jerry West), there are problems with using Webmaster tools to try and address the "duplicate content" problem.

          The basic problem is called a canonicalization problem (www vs non-www) and he explains the problems as well as the solutions here:

          Fixing Internal Duplicate Content with a Non-www Site Redirect - How-to Guide

          He covers both Windows and Linux server solutions.
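          For anyone who doesn't want to click through, the Linux/Apache version of that fix is typically a mod_rewrite rule in the .htaccess file along these lines (a sketch only; "example.com" is a placeholder, and your server needs mod_rewrite enabled):

```apache
# 301-redirect every www request to the bare domain, so only one
# version of each URL ever gets indexed. Swap the two hostnames
# around to standardize on the www version instead.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]
```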

          Marvin
          {{ DiscussionBoard.errors[3518878].message }}
          • Profile picture of the author Kurt
            Originally Posted by JohnMcCabe View Post

            Duplicate content is copies of content on a single site under different file names. For example, most blog platforms are programmed to create duplicate content because the same post will appear in multiple sub-directories (category, date, tags, etc.). Eliminating the effects of all that duplicate content is one of the objectives of the various SEO plugins.

            Syndicated content is more like what is described above. Content, be it article, video or whatever is used on multiple sites with the express permission of the rights holder.
            Originally Posted by Alexa Smith View Post

            ^^^^^ This. Exactly.

            The two are completely different.

            In terms of their location, in SEO terms, in terms of their significance, and all that other stuff.

            If it's on different sites, then it isn't "duplicate content" as far as Google is concerned. So don't think of it as "duplicate content" in your own mind, either, because for SEO purposes, and for generally "understanding what you're doing in internet marketing" too, having a conceptual/definition framework that's so out of line with Google's probably isn't really the best starting position.

            Of course, there's nothing to stop anyone from being like Humpty Dumpty in Through The Looking Glass, if they want to be like that ("When I use a word," Humpty Dumpty said, in a rather a scornful tone, "it means just what I choose it to mean--neither more nor less") but if you allow yourself to imagine that what's really "syndicated content" is "duplicate content" (just because you "choose to define it that way"), then it's going to draw you into all sorts of confusions, problems and misunderstandings.

            And before you know where you are you'll start believing that there's a "duplicate content penalty" for things that aren't even "duplicate content" in the first place, and all sorts of nonsense like this.

            Can you guys please explain what Google's duplicate content filter is and how it applies to different domains? Thanks.
            {{ DiscussionBoard.errors[3518947].message }}
            • Profile picture of the author KingArthur
              JohnMcCabe and DireStraits, can you answer these questions? I found them on another thread, but they are the very questions I need answers for also. So I will change some sentences and add some of my own to tailor them to my own situation. If I can get answers for these it will help me to save a lot of time:



              If I am a part of one of those membership clubs where every member of the club (from 50 up to 150 members) gets the same sites and pages, then these sites are made up of syndicated content and not duplicate content, right? And Google treats them differently in the search results?



              If I change the keywords on the pages so that my pages are called up for different keywords (than the pages of the sites of the other members who leave the pages the way they got them) then that should work in giving me the uniqueness I need to get different traffic from different keywords? Most of the other members do not change anything on their pages.



              What if I only want to rank for the home page and I completely rewrite the home page and the first sentence of each paragraph in the other 19 pages of the site? Will I be able to rank for the home page since Google ranks pages and not sites? Or, will my whole site be sandboxed since there is so much syndicated PLR content on the other pages that is not rewritten (only the first sentence of each paragraph is)?



              The new Google Farmer penalty does not affect my membership site home page, and I can theoretically get #1 in Google because the home page is completely rewritten and the domain keyword is unique compared to the other 50 to 150 membership sites? This ranking can happen even though most of the content on the other pages is not rewritten?


              The King
              {{ DiscussionBoard.errors[3519252].message }}
    • Profile picture of the author Istvan Horvath
      Originally Posted by wanna-succeed View Post

      Wait a sec, so if I have one of my articles, that I wrote myself, published on the main page of my site & in a special category, I have duplicate content??
      All the posts in a blog go to the main blog. After they reach the bottom, they are in a category as well, but you can find them in both places. Am I doing something wrong here?
      I don't think you are doing anything wrong.

      There used to be a time when the fact that different queries can bring up the same content (post) from the database and display it in different templates... was a real issue.
      Talking of a WP blog, your post could be:
      - on your main/front page
      - on the single post view
      - in the category archive list
      - in the monthly archive list
      (but you can create daily, weekly, yearly archives, too!)
      - tag archives

      Then, according to the Google people, search engines got smarter and smarter, and they are aware of how a blogging script like WP works... so they won't punish you ONLY for this.

      Unfortunately, I don't bookmark all the Cutts interviews - although I remember reading (viewing?) it.
      Actually, even the official Google blog works in that way: showing the post once on the main page and
      once again on the single post view.

      All that said: it never hurts if you are using only the_excerpt on your multipost listings and have the full post displayed only in single-post view.

      And now the mandatory "on the other hand": I have an older blog showing full posts on the main page and in single post view as well, etc. Some posts for certain 2 keyword phrases are on the page #1 in SERP for years... because they are the best content on the topic :p
      {{ DiscussionBoard.errors[3518857].message }}
  • Profile picture of the author tat1973
    Syndicated articles are ones that can be redistributed, and duplicate content is pretty much the opposite?
    {{ DiscussionBoard.errors[3483272].message }}
  • Profile picture of the author MagicWhisper
    I thought that syndicated content was when you distribute YOUR OWN work to other sites on the internet, having it under the same (your) name at each site. And duplicate content was when someone else takes your work and distributes it under their name or someone else's name. Am I wrong? Someone please correct me if I am.
    {{ DiscussionBoard.errors[3520350].message }}
    • Profile picture of the author Alexa Smith
      Banned
      Originally Posted by MagicWhisper View Post

      I thought that syndicated content was when you distribute YOUR OWN work to other sites on the internet, having it under the same (your) name at each site. And duplicate content was when someone else takes your work and distributes it under their name or someone else's name. Am I wrong?
      Yes.

      Duplicate content is defined above in John McCabe's post (#4).

      Syndicated content can be syndicated by yourself and/or by others. "Who has done it" has nothing to do with whether or not it's been done.
      {{ DiscussionBoard.errors[3520366].message }}
      • Profile picture of the author JohnMcCabe
        Originally Posted by Marvin Johnston View Post

        According to one recognized expert (Jerry West), there are problems with using Webmaster tools to try and address the "duplicate content" problem.

        The basic problem is called a canonicalization problem (www vs non-www) and he explains the problems as well as the solutions here:

        Fixing Internal Duplicate Content with a Non-www Site Redirect - How-to Guide

        He covers both Windows and Linux server solutions.

        Marvin
        Thanks, Marvin. I forgot all about using the .htaccess file. I've been doing essentially the same thing through the domain registrar's control panel for years.

        Originally Posted by Kurt View Post

        Can you guys please explain what Google's duplicate content filter is and how it applies to different domains? Thanks.
        I think we can all pretty much agree that seeing the same article, or other piece of content, dominating the front page of the search results is undesirable. At least from Google's point of view.

        So when the spiders and bots do their thing, and the black box determines that a certain query would put the same article on the front page multiple times, it will pick one of those copies to display.

        The duplicate content filter is the algorithm which determines which copy to display. While the actual factors and weightings are a secret, some of the factors are believed to be the original date of indexing, the authority and trust of the site containing the copy, the number and type of links to the copy, the authority and trust of the linking site, and so on in a 'drop a rock in a pond and watch the expanding rings' kind of progression.

        Some believe geography plays a role in which copy you see. If so, a searcher in the UK and one in the USA might see the same article pop up for a search, but the UK searcher would see a copy on a UK site and the US searcher on a US site.
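        As a toy illustration of the filtering step described above (the grouping key, the scoring factors, and the weights here are all invented for the sketch; Google's real signals are secret):

```python
# Hypothetical sketch of a duplicate-content filter: group candidate
# results by a fingerprint of their text, then keep only the
# best-scoring copy from each group of duplicates. The scoring
# formula below is a made-up stand-in for Google's secret factors.
from collections import defaultdict

def filter_duplicates(results):
    """results: list of dicts with 'url', 'text', 'authority', 'links'."""
    groups = defaultdict(list)
    for r in results:
        # A real engine would use fuzzy similarity (shingles, simhash),
        # not an exact hash of the normalized text.
        fingerprint = hash(r["text"].lower().strip())
        groups[fingerprint].append(r)
    # From each group of duplicate copies, display only one.
    survivors = []
    for copies in groups.values():
        best = max(copies, key=lambda r: r["authority"] * 2 + r["links"])
        survivors.append(best)
    return survivors
```

        In practice the fingerprinting is fuzzy rather than exact, which is also why lightly altered copies sometimes slip past the filter.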

        Originally Posted by KingArthur View Post

        JohnMcCabe and DireStraits, can you answer these questions? I found them on another thread, but they are the very questions I need answers for also. So I will change some sentences and add some of my own to tailor them to my own situation. If I can get answers for these it will help me to save a lot of time:



        If I am a part of one of those membership clubs where every member of the club (from 50 up to 150 members) gets the same sites and pages, then these sites are made up of syndicated content and not duplicate content, right? And Google treats them differently in the search results?
        While I might quibble with the notion that the content packages you describe are actually syndicated content as opposed to some type of PLR, the effect is close enough for this discussion.

        The sites and pages you get from your membership are not duplicate content as Google uses the term, and they would be treated as individual pages for determining search position.

        Originally Posted by KingArthur View Post

        If I change the keywords on the pages so that my pages are called up for different keywords (than the pages of the sites of the other members who leave the pages the way they got them) then that should work in giving me the uniqueness I need to get different traffic from different keywords? Most of the other members do not change anything on their pages.
        It depends on what you mean by "change the keywords." If you mean simply changing the meta keyword list in the header, then you really haven't changed a significant portion of the page.

        On the other hand, if you went through and changed "toy train" to "model railroad" on a page, you would very likely pull traffic for different keywords.

        The people who change nothing all take their chances on the duplicate content filter I described above. Note, I said filter, not penalty.


        Originally Posted by KingArthur View Post

        What if I only want to rank for the home page and I completely rewrite the home page and the first sentence of each paragraph in the other 19 pages of the site? Will I be able to rank for the home page since Google ranks pages and not sites? Or, will my whole site be sandboxed since there is so much syndicated PLR content on the other pages that is not rewritten (only the first sentence of each paragraph is)?
        You should be able to affect your home page's ranking by rewriting the home page, without rewriting a single sentence of the other pages.

        Look at the oft-cited example of the major news sites. Most of them are filled with pages containing identical articles published in many other places.


        Originally Posted by KingArthur View Post

        The new Google Farmer penalty does not affect my membership site home page, and I can theoretically get #1 in Google because the home page is completely rewritten and the domain keyword is unique compared to the other 50 to 150 membership sites? This ranking can happen even though most of the content on the other pages is not rewritten?


        The King
        Only time will tell for sure, but that looks like a likely outcome.
        {{ DiscussionBoard.errors[3522335].message }}
        • Profile picture of the author Kurt
          Originally Posted by JohnMcCabe View Post

          I think we can all pretty much agree that seeing the same article, or other piece of content, dominating the front page of the search results is undesirable. At least from Google's point of view.

          So when the spiders and bots do their thing, and the black box determines that a certain query would put the same article on the front page multiple times, it will pick one of those copies to display.

          The duplicate content filter is the algorithm which determines which copy to display. While the actual factors and weightings are a secret, some of the factors are believed to be the original date of indexing, the authority and trust of the site containing the copy, the number and type of links to the copy, the authority and trust of the linking site, and so on in a 'drop a rock in a pond and watch the expanding rings' kind of progression.

          Some believe geography plays a role in which copy you see. If so, a searcher in the UK and one in the USA might see the same article pop up for a search, but the UK searcher would see a copy on a UK site and the US searcher on a US site.

          Doesn't this contradict the notion that duplicate content is only content on the same domain?

          And when claiming there is no duplicate content "penalty", isn't it remiss not to mention a duplicate content "filter"? While they are a little different, if a page isn't going to be displayed in the SERPs, what difference does it make if a page is "penalized" or "filtered"?
          {{ DiscussionBoard.errors[3523283].message }}
          • Profile picture of the author DireStraits
            Originally Posted by Kurt View Post

            Doesn't this contradict the notion that duplicate content is only content on the same domain?

            And when claiming there is no duplicate content "penalty", isn't it remiss not to mention a duplicate content "filter"? While they are a little different, if a page isn't going to be displayed in the SERPs, what difference does it make if a page is "penalized" or "filtered"?

            Kurt,

            The problem is that with real duplicate content (i.e., not syndicated content), the filtering out of those pages into the supplemental index isn't where the "penalty" ends: Google may - and does - at its discretion (and if it suspects you have duplicated that content with malicious intent - e.g., on doorway pages, etc.), penalise the rankings of your entire site - not just the offending page(s).

            Naturally, this doesn't happen with syndicated content - no such penalty exists for it - and that's why it's so important to appreciate the distinction between that and duplicated content, and also explains why the words "filter" and "penalty" are used, respectively, to describe the potential effects of each: one is simply an effect of Google's attempt to provide richness and diversity in their search results, for the sake of providing a quality user experience; the other one is a genuine punishment for techniques that Google considers "blackhat", against their policies, and simply won't tolerate.
            {{ DiscussionBoard.errors[3523409].message }}
            • Profile picture of the author Kurt
              Originally Posted by DireStraits View Post

              Kurt,

              The problem is that with real duplicate content (i.e., not syndicated content), the filtering out of those pages into the supplemental index isn't where the "penalty" ends: Google may - and does - at its discretion (and if it suspects you have duplicated that content with malicious intent - e.g., on doorway pages, etc.), penalise the rankings of your entire site - not just the offending page(s).

              Naturally, this doesn't happen with syndicated content - no such penalty exists for it - and that's why it's so important to appreciate the distinction between that and duplicated content, and also explains why the words "filter" and "penalty" are used, respectively, to describe the potential effects of each: one is simply an effect of Google's attempt to provide richness and diversity in their search results, for the sake of providing a quality user experience; the other one is a genuine punishment for techniques that Google considers "blackhat", against their policies, and simply won't tolerate.
              I thought others said there isn't a duplicate content penalty?

              And why is it called a "duplicate content filter" and not a "syndicated content filter"?

              I'll answer my own question...Because Google sees "syndicated" content as duplicate content, despite the claims on this thread.

              And Google also differentiates between duplicate content and "black hat/malicious" content, so I'm not buying this example. As stated above, there is no duplicate content penalty. In my 15 years of SEO experience, Google simply ignores what it feels is duplicate content.

              The truth is, syndicated content is duplicated content, and the phrase "syndicated content" is one used by article marketers on this forum and NOT Google.
              {{ DiscussionBoard.errors[3523497].message }}
              • Profile picture of the author DireStraits
                Originally Posted by Kurt View Post

                I thought others said there isn't a duplicate content penalty?

                And why is it called a "duplicate content filter" and not a "syndicated content filter"?

                I'll answer my own question...Because Google sees "syndicated" content as duplicate content, despite the claims on this thread.

                And Google also differentiates between duplicate content and "black hat/malicious" content, so I'm not buying this example. As stated above, there is no duplicate content penalty. In my 15 years of SEO experience, Google simply ignores what it feels is duplicate content.

                The truth is, syndicated content is duplicated content, and the phrase "syndicated content" is one used by article marketers on this forum and NOT Google.
                Kurt,

                I appreciate that Google themselves may not often apply a unique label to each - perhaps because they see them, for the most part (as you just said), as being one and the same; but the fact remains that one practice can invoke a site-wide penalty, and the other one cannot ... or at least doesn't.

                I do try, myself, to refer to the mere filtration of non-unique content as the "duplicate/syndicated content filter" rather than a penalty. You're right: both, to an extent, and in most normal cases, undergo the exact same treatment. Google doesn't penalise all sites with duplicated content - only usually those it has identified as having utilised it as a component of "insidious" (blackhat) SEO techniques.

                Having said that, given the potential (not certain) implication of a penalty for duplicated content, I think it's wise, helpful and wholly justified to further clarify the distinction between duplicated and syndicated content, by using different terms to describe each. It cuts down (or would do, if more people appreciated the difference) on the volume of conversations on the subject in which participants are talking at cross-purposes to one another.
                {{ DiscussionBoard.errors[3523668].message }}
          • Profile picture of the author Alexa Smith
            Banned
            Originally Posted by Kurt View Post

            Doesn't this contradict the notion that duplicate content is only content on the same domain?
            No; it doesn't.

            Originally Posted by Kurt View Post

            if a page isn't going to be displayed in the SERPs, what difference does it make if a page is "penalized" or "filtered"?
            Now you're apparently confusing two different things - but we both know that you know better than that.

            This is exactly how the myth of a "duplicate content penalty" grew up in the first place, by two different things apparently being confused by "spinning" enthusiasts with either a financial or an emotional investment in being able to justify a misguided perspective. (I don't for a moment suggest, of course, that you have any financial investment in your perspective. Far from it!).

            It makes all the difference in the world whether a page is penalised or filtered. For the simple reason that if a page is filtered, it goes into the supplemental index, where its backlink still counts just the same as any other non-filtered backlink. Being filtered, in other words, doesn't affect its backlink(s) - no negative effect on the SEO of the site linked to.

            Clear now?

            So, people "spinning" articles and mass-submitting to article directories, in order to get each into Google's main index carrying a backlink to their site, rather than simply submitting syndicated copies to those same directories and getting most/many of them "only" into the supplemental index carrying a backlink to their site, are gaining nothing in backlink terms.

            People (not yourself of course, Kurt, but many people reading discussions of this nature) imagine that when they do this, it's THEIR SITE (i.e. the site linked to in the articles' resource-boxes) that's going to incur some sort of "penalty" or be "filtered".

            They imagine that submitting syndicated rather than spun copies is going to impact their own site negatively.

            You only have to look at this section of the forum any day of the week, to see that. At least once a day there's a new thread started off by someone imagining this to be the case. And often many people post in it saying "Yes, that's right: you'll get your site penalised for duplicate content if you do that".

            It's all nonsense, of course.


            And the reason so many people believe it is that other people with their "spinning" hats on want them to believe it, and perpetuate it. Sometimes because they want to sell them things, but more often just because they want to continue believing it themselves.

            It would be doing the forum's members and the wider internet marketing community a far greater service to expose "spinning" for what it really is.

            Here's Paul Myers' view, if you don't like mine. Oops, sorry, they're the same, aren't they?
            {{ DiscussionBoard.errors[3523625].message }}
            • Profile picture of the author JohnMcCabe
              Originally Posted by 2ndopkate View Post

              John,

              Where are you doing that in the registrar control panel? Is it in the Nameserver panel?

              Thanks,
              Kater
              Kater, it's in the host records section. I set the default host record to the domain name (example.com), then automatically redirect a request for the www. version to the default.

              It's like when affiliates redirect a domain to an affiliate site, only I'm redirecting to the version of the url I want to be the default.
              {{ DiscussionBoard.errors[3523689].message }}
            • Profile picture of the author Kurt
              Originally Posted by Alexa Smith View Post

              No; it doesn't.



              Now you're apparently confusing two different things - but we both know that you know better than that.
              Nope. I'm not confusing ANYTHING. I merely want you to explain the difference to people reading this thread who may not know the difference.

              You have mistaken my questions as a request for knowledge for myself, instead of wanting more info for people you are trying to "educate" and who may not be knowledgeable enough to ask the questions that need to be asked.


              This is exactly how the myth of a "duplicate content penalty" grew up in the first place, by two different things apparently being confused by "spinning" enthusiasts with either a financial or an emotional investment in being able to justify a misguided perspective. (I don't for a moment suggest, of course, that you have any financial investment in your perspective. Far from it!).
              I've never said there was a duplicate penalty, and have posted so many times. Earlier in this thread I even posted that in my 15 years of SEO experience, I've found Google simply ignores dupe content.


              It makes all the difference in the world whether a page is penalised or filtered. For the simple reason that if a page is filtered, it goes into the supplemental index, where its backlink still counts just the same as any other non-filtered backlink. Being filtered, in other words, doesn't affect its backlink(s) - no negative effect on the SEO of the site linked to.

              Clear now?
              I was always clear on the issue. However, it seems you're not. The issue is that what you call "syndicated content" is in reality dupe content, using Google's definitions, not mine or yours.


              So, people "spinning" articles and mass-submitting to article directories, in order to get each into Google's main index carrying a backlink to their site, rather than simply submitting syndicated copies to those same directories and getting most/many of them "only" into the supplemental index carrying a backlink to their site, are gaining nothing in backlink terms.
              I agree. But you left out that they can gain something by having additional pages included in the SERPs for different keywords that won't be "filtered".


              People (not yourself of course, Kurt, but many people reading discussions of this nature) imagine that when they do this, it's THEIR SITE (i.e. the site linked to in the articles' resource-boxes) that's going to incur some sort of "penalty" or be "filtered".

              They imagine that submitting syndicated rather than spun copies is going to impact their own site negatively.
              If this is the case, then I agree.


              You only have to look at this section of the forum any day of the week, to see that. At least once a day there's a new thread started off by someone imagining this to be the case. And often many people post in it saying "Yes, that's right: you'll get your site penalised for duplicate content if you do that".

              It's all nonsense, of course.

              It is also nonsense to claim "syndicated" content is different than "dupe" content, other than that syndicated content appears on other sites. Google considers similar content to be duplicate content even when placed on other sites.



              And the reason so many people believe it is that other people with their "spinning" hats on want them to believe it, and perpetuate it. Sometimes because they want to sell them things, but more often just because they want to continue believing it themselves.

              It would be doing the forum's members and the wider internet marketing community a far greater service to expose "spinning" for what it is.

              As I've replied to you on other threads, it's the strategy used for spinning, not the technique itself.




              The issue isn't what you or Paul believe (or me), it's what Google believes. And Google does address that the same content on different sites is duplicate content. And this is my main point of contention, and spinning has nothing to do with it.

              Stop telling people that there's a difference between syndicated content and dupe content.

              Paul said:
              "The way I do it is different." If so, good for you. You're just stuck in a game that's going to end up a losing bet.
              And for the record, I disagree with what Paul posted on that thread and do believe it's how you do it... And Paul's comments won't change my own 15 years' experience of spinning using my own technique.

              15 years and counting...I'm still waiting for the "losing bet".

              One more time...I don't spin individual words, but rather entire chunks of info, such as "100 dog training tips". I can spin 100 tips into hundreds of different articles, each using 3-7 different combinations of tips, so each article is usually 60-100% unique from any other dog training tips article.

              And by unique, I mean unique ideas and tips, not merely unique text strings.

              But again, my point was never about spinning, it was about confusing people that there's a difference between syndicated and dupe content. Sorry, but Google doesn't see it that way, and the responses to my questions above prove my point.
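              For what it's worth, the chunk-based approach described above can be sketched like this (a rough illustration only; the tip texts, counts, and function name are placeholders, not anyone's actual tool):

```python
# Sketch of "chunk spinning": instead of swapping synonyms, each
# article is assembled from a different combination of whole tips,
# so any two articles differ by entire ideas rather than word choice.
import random

def build_articles(tips, n_articles, tips_per_article=(3, 7), seed=None):
    """Assemble articles from distinct combinations of complete tips."""
    rng = random.Random(seed)
    seen = set()
    articles = []
    while len(articles) < n_articles:
        k = rng.randint(*tips_per_article)
        combo = tuple(sorted(rng.sample(range(len(tips)), k)))
        if combo in seen:  # skip a combination that was already used
            continue
        seen.add(combo)
        articles.append("\n".join(tips[i] for i in combo))
    return articles
```

              With 100 tips and 3-7 tips per article, the number of possible combinations is enormous, which is why each article comes out largely unique in content, not just in text strings.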
              {{ DiscussionBoard.errors[3524045].message }}
              • Profile picture of the author Alexa Smith
                Banned
                Originally Posted by Kurt View Post

                You have mistaken my questions as a request for knowledge for myself
                Not at all: I knew the second I saw your post - as did others - that your motivation for asking certainly wasn't to try to learn anything.

                I won't say more, except that I think it's in everyone's interests for you and I to agree to disagree about these matters. To quote John McCabe (addressing someone else yesterday on virtually the same subject): "I made my pitch, and it bounced off a locked door".
                {{ DiscussionBoard.errors[3524179].message }}
              • Profile picture of the author DireStraits
                Originally Posted by Kurt View Post

                It is also nonsense to claim "syndicated" content is different than "doop" content, other than sydicated content appears on other sites. Google considers similar content to be duplicate content even when placed on other sites.
                Kurt,

                Again, there is a difference in some cases - and it's a significant and substantial one.

                Content duplicated between websites (i.e., what we refer to as "syndicated content", so as to avoid ambiguity) doesn't carry any adverse consequences for your entire site. The direct SEO / traffic benefits of publishing such content may be slim to none, but you won't be any worse off for publishing it.

                Publishing duplicates of the same content across multiple pages of a single website (i.e., what we refer to as "duplicate content") can carry negative consequences for your entire site, if Google thinks the reason behind that duplication is for the purpose of spamming and/or gaming their SERPs, using techniques they disapprove of.

                I realise that not all cases of duplicated content result in penalisation - but that doesn't alter the fact that it can happen, and that the effects could be disastrous if it did.

                Surely you're not implying that, from an SEO perspective, the difference between little/nothing to gain and a lot/everything to lose is so inconsequential as to fail to justify the need for a separate term to describe the practice (or the "type of content") leading to each.

                How else do you propose to help people make the distinction between their effects if they see them both as one and the same, due to being branded with the same name?
                {{ DiscussionBoard.errors[3524228].message }}
        • Profile picture of the author 2ndopkate
          Originally Posted by JohnMcCabe View Post

          Thanks, Marvin. I forgot all about using the .htaccess file. I've been doing essentially the same thing through the domain registrar's control panel for years.
          John,

          Where are you doing that in the registrar control panel? Is it in the Nameserver panel?

          Thanks,
          Kater
          {{ DiscussionBoard.errors[3523450].message }}
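          For anyone doing this at the server rather than the registrar, a minimal .htaccess sketch of the idea (assuming Apache with mod_rewrite enabled; example.com is a placeholder domain, not anyone's actual site) that 301-redirects the non-www hostname to the www version, so only one URL ever serves each page:

          ```apache
          # Hypothetical example - replace example.com with your own domain.
          RewriteEngine On
          # If the request came in without the www prefix...
          RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
          # ...permanently redirect it to the www version of the same path.
          RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
          ```

          Which end you do it at (registrar forwarding vs. .htaccess) matters less than doing it somewhere, so the engines only see one hostname.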
    • Profile picture of the author Dele
      Originally Posted by MagicWhisper View Post

      And duplicate content was when someone else takes your work and distribute it under their name or someone else's name. Am I wrong? Someone please correct me if I am.
      Yes, you are wrong. That is plagiarism - though duplicate content issues also arise from the reproduced content, since an original exists somewhere else.

      If however it was published with the original owner's name and links intact, no plagiarism would be involved, it would only be duplicate content. But note as explained in my next post, duplicate content across sites is not necessarily penalized by Google except in certain specified circumstances.
      Signature

      What Others Are Saying About This Top MLM Company | Get Brand New, Brand Name Products For Pennies @ New Penny Auctions | Play Online Game At Eager Zebra Games | The source through which i smile to the Bank daily with $$$ => Top Home Based Businesses

      {{ DiscussionBoard.errors[3691407].message }}
      • Profile picture of the author DireStraits
        Originally Posted by Dele View Post

        Yes you are wrong. That is plagiarism even though duplicate content issues also arise from the re-produced content since it has an original somewhere else.
        SEO-related duplicate content issues (i.e., penalisation of your site in the SERPs) usually only arise when the content has been duplicated across multiple pages of the same site, and in such a way as to give Google reason to judge it an attempt to game their search results in violation of their accepted practices. In (most) other cases, all that happens is that the duplicates are filtered out of the SERPs.

        Content duplicated (or "syndicated", rather) across multiple sites - whether legitimately (with the original author's/owner's consent) or otherwise - doesn't usually pose any risk of "duplicate content issues" arising from an SEO perspective, in the sense that your own site's good standing and ranking positions in the SERPs will not be negatively impacted.

        It might pose a problem in the sense that if it's illegitimately copied, it's plagiarism / copyright infringement, but that's a different matter entirely (and not something you'd usually label a "duplicate content issue", but rather an issue of legality).
        {{ DiscussionBoard.errors[3691471].message }}
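        Where on-site duplication is unavoidable (archive pages, print versions, tracking parameters), the usual fix is to point all the duplicate URLs at a single preferred one with a canonical link. A minimal sketch - the URL here is a placeholder, not a real page:

        ```html
        <!-- Placed in the <head> of each duplicate page, pointing at the preferred URL -->
        <link rel="canonical" href="http://www.example.com/original-article/" />
        ```

        That tells Google which copy should carry the ranking, instead of leaving it to guess.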
        • Profile picture of the author klauzer
          I hope it is ok to write in this thread.

          I have problems with duplicate content. Well, that's what Google is saying :-(

          Anyway. My site got deindexed about 2 months ago, and I know that my site was not unique.
          So I decided to clean my site and start from the beginning.
          After that I approved about 10 articles and sent my site to Google for reconsideration.
          And guess what - my site still was not good enough for Google. Remember that I only had about 10 articles on my site. They were (I think) spun articles, so they were not 100% unique. I checked for duplicate content and there were about 2-3 sites that had some sentences with the same content as my site.

          So I don't know what to do anymore. My site has been deindexed for 2 months now. Could it be the design of the site that makes it not good enough for Google?
          {{ DiscussionBoard.errors[3815551].message }}
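          For what it's worth, you can do a rough version of that duplicate check yourself before resubmitting. This is a sketch of the general idea only - the sample texts and the 0.8 threshold are illustrative assumptions, not values Google is known to use:

          ```python
          # Rough near-duplicate check between two pieces of text,
          # using Python's standard-library sequence matcher.
          from difflib import SequenceMatcher

          def similarity(text_a: str, text_b: str) -> float:
              """Return a similarity ratio between 0.0 (no overlap) and 1.0 (identical)."""
              return SequenceMatcher(None, text_a.lower(), text_b.lower()).ratio()

          original = "Duplicate content is the same content under different URLs on one site."
          spun = "Duplicate content means the same content under different URLs on one site."

          score = similarity(original, spun)
          print(f"similarity: {score:.2f}")
          if score > 0.8:  # illustrative threshold, not Google's
              print("likely a near-duplicate")
          ```

          Lightly spun articles tend to score very high on a check like this, which is consistent with what you describe seeing.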
  • Profile picture of the author Kurt
    Let me also add... I believe the "typical" spun content IS dupe content - only the text strings are "unique" - and I have posted this opinion before.

    However, the OP's question is a simple one: "Duplicate vs Syndicated Content" and any discussion of spinning isn't relevant to the topic.

    Syndicated content is simply duplicate content posted on two or more different domains.
    {{ DiscussionBoard.errors[3524116].message }}
  • Profile picture of the author KingArthur
    It is funny that a lot of my sites are not showing up in the search results.


    What tools can I use to see if my site has been sandboxed? If I could find definitive tools it would be great. Is there a way to find which page of google my site is appearing on without having to look through many pages manually?


    I know that the only reason my sites could be sandboxed, after 10 weeks of being backlinked nicely, is that there are a number of other sites with the same content. Remember, this would be identical content that appears on the other 19 pages of each site, and not on my home page, since my home page is completely rewritten.


    Thanks to the people who have helped me so far on this thread,


    King Arthur


    Update: It looks like most of my sites could be sandboxed because they are less than 6 to 8 months old and it could go up to 2 years: http://www.ehow.com/how_4875384_tell...e-sandbox.html
    {{ DiscussionBoard.errors[3526893].message }}
  • Profile picture of the author FogHorn
    [DELETED]
    {{ DiscussionBoard.errors[3527468].message }}
    • Profile picture of the author Richard Van
      Originally Posted by FogHorn View Post

      Easy: if the dupe content is published on different sites (sites that do not share the same server), you are all good - you can do it all day long. In fact, the big G will reward you for it! (Personal Exp)
      Reward you? How will it do that?
      Signature

      Wibble, bark, my old man's a mushroom etc...

      {{ DiscussionBoard.errors[3527708].message }}
  • Profile picture of the author Dele
    Duplicate content is more or less identical content found on the same or on different websites. Syndicated content is one way by which duplicate content can arise.

    The confusion arises because not all duplicate content is penalized by Google. Duplicate content on different sites is not penalized by Google except in certain specified circumstances. On the other hand, most duplicate content on the same site is penalized by Google, but here again there are specified exceptions where it is not.

    Note that "duplicate" is an English word found in the dictionary, with essentially one meaning: more or less "mirror". It is not a coinage of Google's, so whenever one piece of content more or less mirrors another, it is a "duplicate" of that content.

    This post explains the duplicate content issue in more detail:

    => Duplicate Content Controversy : Finally Laying It To Rest | Profitable Business Ideas
    {{ DiscussionBoard.errors[3691421].message }}
  • Profile picture of the author Rich77sm
    Sorry, but I doubt your site will ever get reindexed.
    I keep wondering when the people with deindexed dup-content sites are going to chime in. This is blowing my mind and the dust isn't settling!
    Everything I thought about content is changing.

    I just hear so much about people being indexed...
    {{ DiscussionBoard.errors[3831283].message }}
  • Profile picture of the author jakecoop79
    For those who believe that spinning articles for article marketing is not necessary (just for backlinks, not for ranking the article) ...

    Is it ok to use PLR for this?
    {{ DiscussionBoard.errors[4094343].message }}
