Can offsite SEO overcome syndication?

by xento
10 replies
  • SEO
Suppose you use syndicated content on your website. Now suppose also that this syndicated content is poorly optimized for SEO, lacks a good backlinking strategy, and mostly just sits on the web doing nothing.

Do you think that if you republished this content and put some effort into promoting it, you could actually make it shine and rank better than the original? Obviously you would keep the author's resource box intact, so it would be a win-win situation.
  • awj888
    Essentially making the on-page SEO for the articles better? Sounds like a good plan! Though if you make too many changes, the original author may not like that.
    If it's a case of keywords, title tags, heading tags, image alt text, etc., it can't do any harm.
    I use ClickBump SEO on a lot of my WordPress blogs - it checks the on-page stuff pretty nicely and gives you a checklist. It's essentially SEOPressor (much better known) at half the price, plus it includes the LSI Google keywords feature, which can really help your rankings, especially for long-tail terms!

    (I'm not promoting either of those products, just sharing personal experience.)
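
    For anyone curious what that kind of checklist looks like under the hood, here is a rough Python sketch of a few on-page checks. This is not ClickBump SEO or SEOPressor's actual code, just an illustration using the third-party beautifulsoup4 package:

```python
# A rough sketch of a basic on-page SEO checklist (illustration only,
# not ClickBump SEO or SEOPressor code). Requires: pip install beautifulsoup4
from bs4 import BeautifulSoup

def onpage_checklist(html, keyword):
    """Run a few simple on-page checks for a target keyword."""
    soup = BeautifulSoup(html, "html.parser")
    title = soup.title.string if soup.title and soup.title.string else ""
    h1s = [h.get_text(strip=True) for h in soup.find_all("h1")]
    images = soup.find_all("img")
    return {
        "keyword in title": keyword.lower() in title.lower(),
        "title 60 chars or less": 0 < len(title) <= 60,
        "exactly one h1": len(h1s) == 1,
        "keyword in an h1": any(keyword.lower() in h.lower() for h in h1s),
        "all images have alt text": all(img.get("alt") for img in images),
    }

page = ("<html><head><title>New Mustang Review</title></head><body>"
        "<h1>2012 Mustang</h1><img src='car.jpg' alt='Mustang'></body></html>")
for check, passed in onpage_checklist(page, "mustang").items():
    print("PASS" if passed else "FAIL", "-", check)
```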
  • C Rebecca
    First, if you publish syndicated content on your website without any reference to the source, it will be flagged as duplicate content.

    Second, even if you republish it and promote it, it will still be flagged as duplicate content, because the content has already been published on another website.
    • xento
      Originally Posted by C Rebecca

      First, if you publish syndicated content on your website without any reference to the source, it will be flagged as duplicate content.

      Second, even if you republish it and promote it, it will still be flagged as duplicate content, because the content has already been published on another website.
      I understand crediting original sources and have no problem with that.

      However, I fail to see the logic in being so careful with duplicate content. If I have a site about Mustang cars and there is an awesome article about new Mustangs whose author allows syndication, I would love to give that content to my visitors at the cost of a few links, because it adds value to my own site. The same goes for news stories: if I want to provide up-to-date info to my visitors, then I will certainly pull the latest news about Mustang cars.

      Also, I read elsewhere on this forum that duplicate content refers to identical content on more than one page of the same website, rather than duplication across the web. That makes more sense, since a popular article will certainly get shared all over the place, and Google cannot possibly penalize every site that shares it.
      • dburk
        Hi xento,

        Yes, you can definitely get content from syndicated sources to outrank the original. It happens all the time. Essentially, whoever promotes their page most effectively will be the one who ranks.

        Keep in mind that since there are a number of duplicates, you will often get only one page of that duplicate content into a particular SERP. This is due to the duplicate content filter: all but one page is filtered out, the winning page ranks, and even the first runner-up is filtered.

        So, not only must you outrank all other content for your targeted keyword, you must also outrank all of the duplicates, or else your page won't show in the SERP at all.
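
        To make "duplicate" a little more concrete: search engines typically compare pages with fuzzy fingerprints rather than exact matches. Google's actual filter is proprietary, but here is a toy Python sketch of the textbook shingling approach, just to illustrate the idea:

```python
# Toy illustration of near-duplicate detection via word shingles and
# Jaccard similarity. This is the textbook idea only, NOT Google's
# actual (proprietary) duplicate content filter.

def shingles(text, w=3):
    """Return the set of w-word shingles in a text."""
    words = text.lower().split()
    return {" ".join(words[i:i + w]) for i in range(len(words) - w + 1)}

def jaccard(a, b):
    """Jaccard similarity of two shingle sets, from 0.0 to 1.0."""
    return len(a & b) / len(a | b) if a | b else 0.0

original   = "the new mustang has a 5.0 liter v8 engine and sharp handling"
syndicated = "the new mustang has a 5.0 liter v8 engine and crisp handling"

sim = jaccard(shingles(original), shingles(syndicated))
print(f"similarity: {sim:.2f}")  # 0.67 here; a high score means the
                                 # pages get treated as near-duplicates
```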
        • xento
          Originally Posted by dburk

          Hi xento,

          Yes, you can definitely get content from syndicated sources to outrank the original. It happens all the time. Essentially, whoever promotes their page most effectively will be the one who ranks.

          Keep in mind that since there are a number of duplicates, you will often get only one page of that duplicate content into a particular SERP. This is due to the duplicate content filter: all but one page is filtered out, the winning page ranks, and even the first runner-up is filtered.

          So, not only must you outrank all other content for your targeted keyword, you must also outrank all of the duplicates, or else your page won't show in the SERP at all.
          I was not really planning on going after the same results page. If allowed, I might do some minor editing to target long-tail keywords instead.

          Originally Posted by AlphaWarrior

          I have actually used articles from an article directory and changed or modified the keywords for SEO purposes, but made no major changes. I have always left the resource box intact. If an author ever contacts me, I will simply remove the article from my site. But I can't imagine an author fussing, since I do not change the character of the articles, and I leave the resource box intact so that the authors get a link and possibly some viewers.
          This is actually pretty much what I had in mind. May I ask if you've had any good results using the technique?
        • Barry Unruh
          Originally Posted by dburk

          Keep in mind that since there are a number of duplicates, you will often get only one page of that duplicate content into a particular SERP. This is due to the duplicate content filter: all but one page is filtered out, the winning page ranks, and even the first runner-up is filtered.
          Which is exactly why a search for the full text of the United States Constitution returns only 7,990,000 results...

          Hmmm... maybe someone should tell those other 7,989,999 sites to quit messing up the search results...

          Then go ahead and search for the lyrics to the kids' song "London Bridge Is Falling Down"... only 251,000 results...

          Did those 250,999 other sites miss the memo?

          Quit it with that duplicate content filter stuff... unless you can prove it.
          • dburk
            Originally Posted by Barry Unruh

            Which is exactly why a search for the full text of the United States Constitution returns only 7,990,000 results...

            Hmmm... maybe someone should tell those other 7,989,999 sites to quit messing up the search results...

            Then go ahead and search for the lyrics to the kids' song "London Bridge Is Falling Down"... only 251,000 results...

            Did those 250,999 other sites miss the memo?

            Quit it with that duplicate content filter stuff... unless you can prove it.
            Hi Barry,

            There is no need to send out a memo telling those "sites to quit messing up the search results". I just checked, and the duplicate content filter seems to be working just fine on those SERPs.



            Google, like all other popular search engines, returns a maximum of 1,000 results per query, so you will never be able to find a listing for a page at position 1,001.

            Google employs duplicate content filters in several ways:

            1. They apply a filter within their crawler mapping code. After a large number of pages with the same content have been indexed, they stop indexing new copies. This prevents the web crawler from wasting valuable resources.

            2. When there is a large number of pages with nearly identical content, they filter the main index and move the additional pages into the supplemental index. This again saves resources by removing duplicate content from the main index.

            3. The actual SERP filters identical content whenever other relevant pages are available, to provide diversity in results for users (a toy sketch follows below).
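
            Here is a toy sketch of point 3, to make the "all but one is filtered" behavior concrete. This is assumed behavior for illustration, not Google's actual code:

```python
# Toy sketch of a SERP-level duplicate filter: among pages that carry
# the same content fingerprint, only the best-ranked one is shown.
# (Assumed behavior for illustration, not Google's actual code.)
from hashlib import sha1

def fingerprint(content):
    """Crude exact-match fingerprint; real engines use fuzzier hashes."""
    return sha1(" ".join(content.lower().split()).encode()).hexdigest()

def filter_serp(ranked_pages):
    """ranked_pages: list of (url, content) tuples, best-ranked first."""
    seen, results = set(), []
    for url, content in ranked_pages:
        fp = fingerprint(content)
        if fp not in seen:       # first (best-ranked) copy wins...
            seen.add(fp)
            results.append(url)
        # ...and every runner-up with the same fingerprint is filtered
    return results

pages = [
    ("https://original.example/mustang",   "New Mustang review text"),
    ("https://syndicator.example/mustang", "New Mustang review text"),
    ("https://other.example/camaro",       "Camaro review text"),
]
print(filter_serp(pages))
# ['https://original.example/mustang', 'https://other.example/camaro']
```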

            So, no worries, Google's duplicate content filter is working. However, if you ever come across an example where it doesn't seem to be working, please let me know and we can co-author a memo to Google letting them know that their duplicate content filter isn't working.

            Here is what Google has to say about their duplicate content filter:

            Official Google Webmaster Central Blog: Deftly dealing with duplicate content

            Official Google Webmaster Central Blog: Demystifying the "duplicate content penalty"

            Official Google Webmaster Central Blog: Duplicate content due to scrapers






            Barry, I hope you found the information informative and useful.
            • Barry Unruh
              Originally Posted by dburk

              Keep in mind that since there are a number of duplicates, you will often get only one page of that duplicate content into a particular SERP. This is due to the duplicate content filter: all but one page is filtered out, the winning page ranks, and even the first runner-up is filtered.

              So, not only must you outrank all other content for your targeted keyword, you must also outrank all of the duplicates, or else your page won't show in the SERP at all.
              Originally Posted by dburk

              Barry, I hope you found the information informative and useful.

              Don, I did find your second post very informative, but I still feel your first post is inaccurate.

              "All but one page is filtered" is completely inaccurate, as you have eloquently proven.

              You can request to see all of the items in the supplemental results that were omitted, and their backlinks still count (a quick sketch of how to ask for those omitted results follows below). If there are far too many results, then some will eventually not be indexed, but for the sake of most of our efforts it is highly unlikely we are targeting duplicate content that numbers in the millions, or even the hundreds of thousands.
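
              As far as I can tell, the "repeat the search with the omitted results included" link simply re-runs the query with filter=0 appended; treat that parameter's behavior as my assumption, not gospel. A tiny Python sketch:

```python
# Build a Google search URL that asks for omitted (filtered) results too.
# The filter=0 parameter is the one the "repeat the search with the
# omitted results included" link appears to add; treat it as an assumption.
from urllib.parse import urlencode

def google_url(query, include_omitted=False):
    params = {"q": query}
    if include_omitted:
        params["filter"] = "0"   # 0 = show near-duplicate results as well
    return "https://www.google.com/search?" + urlencode(params)

print(google_url("united states constitution full text", include_omitted=True))
```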

              From one of the official blog posts you sent us to:

              There's no such thing as a "duplicate content penalty."

              AND

              You can help your fellow webmasters by not perpetuating the myth of duplicate content penalties! The remedies for duplicate content are entirely within your control.

              Those are some good words of wisdom...

              But I do readily admit that I learned quite a bit by reading through what you provided and listening to some of the videos; I just reach a different conclusion about "All but one page is filtered".

              Now, to make this perfectly clear, I do not recommend the use of duplicate content. I have never used duplicate content on any of my sites, nor do I intend to.

              I do greatly appreciate your second post; it is filled with great information that everyone should pay attention to, which is why I clicked "Thank You".
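
              And since those remedies really are within your control: the main one the Google posts describe for syndication is the rel="canonical" link, which tells search engines which copy of an article is the original. Here is a quick sketch to check what a page declares; it assumes the third-party requests and beautifulsoup4 packages, and the URL is just a placeholder:

```python
# Check which URL a page declares as canonical via its
# <link rel="canonical"> tag. Requires: pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup

def get_canonical(url):
    """Return the canonical URL a page declares, or None if absent."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    link = soup.find("link", rel="canonical")
    return link.get("href") if link else None

# Placeholder URL for illustration
print(get_canonical("https://www.example.com/some-article"))
```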
      • C Rebecca
        Originally Posted by xento

        I understand crediting original sources and have no problem with that.

        However, I fail to see the logic in being so careful with duplicate content. If I have a site about Mustang cars and there is an awesome article about new Mustangs whose author allows syndication, I would love to give that content to my visitors at the cost of a few links, because it adds value to my own site. The same goes for news stories: if I want to provide up-to-date info to my visitors, then I will certainly pull the latest news about Mustang cars.

        Also, I read elsewhere on this forum that duplicate content refers to identical content on more than one page of the same website, rather than duplication across the web. That makes more sense, since a popular article will certainly get shared all over the place, and Google cannot possibly penalize every site that shares it.
        Your point is absolutely right. But, for instance, if you publish syndicated content on your website without crediting the original source, it triggers Google's algorithm for identifying duplicate content.

        No doubt you are publishing it just for your users, but Google probably does not know this and may filter your pages from its search results.
  • AlphaWarrior
    Originally Posted by xento

    Suppose you use syndicated content on your website. Now suppose also that this syndicated content is poorly optimized for SEO, lacks a good backlinking strategy, and mostly just sits on the web doing nothing.

    Do you think that if you republished this content and put some effort into promoting it, you could actually make it shine and rank better than the original? Obviously you would keep the author's resource box intact, so it would be a win-win situation.
    I have actually used articles from an article directory and changed or modified the keywords for SEO purposes, but made no major changes. I have always left the resource box intact. If an author ever contacts me, I will simply remove the article from my site. But I can't imagine an author fussing, since I do not change the character of the articles, and I leave the resource box intact so that the authors get a link and possibly some viewers.
