How do I get around dupe content problem for my client?

by TimD
21 replies
I have a client -
He's been parking a bunch of city domains
snoqualmiecarrepair.com
monroecarrepair.com
etc.
on this regional domain
pugetsoundcarrepair.com

Now for marketing reasons, he wants to put a site on each of these domains.
But they will be duplicate content, and Google says it doesn't want to list them all (I've asked on the Webmaster Tools forum).

What should I do?
Google Places?
Put up the sites anyway and hope Google overlooks them?
I can spin the content, but the client thinks there's good reason to invest in one site and then get cheap copies after that. Besides, he has very little content on the site, so spinning will be tough.

What do you suggest?
#client #content #dupe #problem
  • Profile picture of the author VegasGreg
    Just rewrite the main content on each site slightly to make it look and feel different. You can outsource this pretty cheap. Use a different writer for each rewrite to get a unique angle on each one.

    Unless the business has a unique address, phone number and business license for each domain name you can't use Google Places except for the main/real business name.
    Signature

    Greg Schueler - Wordpress Fanatic... Living The Offline Marketing Dream...

  • Profile picture of the author MikeFriedman
    What you are talking about is not an issue with duplicate content.

    Google indexes duplicate content ALL the time.

    For example, find a popular article from the Associated Press. It has been syndicated to the New York Times, LA Times, Baltimore Sun, Washington Post, etc. And it is probably indexed at all of those sites.

    Another example. Take a franchise like Planet Fitness. If you visit the page for an individual site, it is the exact same as all the rest of the individual pages except that the address, phone number, hours, etc. are different.

    What you are looking at doing, you are not going to have a problem with.
    Signature
    Get the TIPS and STRATEGIES I use to HELP businesses GROW through SEO.
    Delivered to you each week!

    >>> Sign Up Now <<<
    • Profile picture of the author TimD
      Originally Posted by MikeFriedman View Post

      What you are talking about is not an issue with duplicate content.

      Google indexes duplicate content ALL the time.

      For example, find a popular article from the Associated Press. It has been syndicated to the New York Times, LA Times, Baltimore Sun, Washington Post, etc. And it is probably indexed at all of those sites.

      Another example. Take a franchise like Planet Fitness. If you visit the page for an individual site, it is the exact same as all the rest of the individual pages except that the address, phone number, hours, etc. are different.

      What you are looking at doing, you are not going to have a problem with.
This is great to hear, but the response I got from the Google Webmaster Tools forum is that what I'm describing will create duplicate content: a search on monroe car repair will probably bring up both monroecarrepair.com and snoqualmiecarrepair.com, and when Google sees this, it will filter one out.

How do I know who's right?
      • Profile picture of the author MikeFriedman
        Originally Posted by TimD View Post

This is great to hear, but the response I got from the Google Webmaster Tools forum is that what I'm describing will create duplicate content: a search on monroe car repair will probably bring up both monroecarrepair.com and snoqualmiecarrepair.com, and when Google sees this, it will filter one out.

How do I know who's right?
        That would only happen if both are optimized for "monroe car repair". Even in that case, there is a chance that both might show up.
        Signature
        Get the TIPS and STRATEGIES I use to HELP businesses GROW through SEO.
        Delivered to you each week!

        >>> Sign Up Now <<<
        • Profile picture of the author Ghalt
          I saw a video of Matt Cutts (from Google) who specifically addressed the idea of duplicate content when it comes to storefronts with multiple addresses.

He specifically said that they prefer to see a distinct page for each address. So, for example, having a site that uses a script to do a "store finder", with all store locations contained within the same page, is NOT the preferred way to do it. He said he preferred each location to have a standalone page.

          The idea being that someone in Monroe who is searching wants to see the site for the location near them, not somewhere else.
          Signature
          Got eBook?

          Save money - make your eBook Cover yourself with free software (GIMP), and our detailed guide: http://www.makeebookcovers.com
          • Profile picture of the author TimD
Ghalt, I'm grateful for your ideas. My challenge is that this customer doesn't want to put separate city pages on one master website. He wants to put up separate websites for different cities, all with the same content.

            Thanks for your input.
  • Profile picture of the author Joshua Morris
    Hey guys,

In my experience, and from watching pretty much everything Matt Cutts says, it's quite apparent that duplicate content is a big issue!

It's not a problem when you are just trying to INDEX pages, but if you want to RANK them, it will be.

Google does not like seeing duplicate content in the top rankings. I find that if I have duplicate content on my pages, most of them will not rank higher than position 20, although the main page might get to the top.

I also saw improved rankings across the board for a client's site, but then they asked me to put the same content on each page. At the time I wasn't sure about it, but I did it anyway. The result was that 10 of the 11 pages on the site dropped from page 1 to page 3, and the main page dropped to page 2 as well.

As soon as I changed it back, put unique content on each page, and bookmarked or pinged it, the rankings jumped back up.

I have seen this happen with many of my affiliate sites, and it's very obvious that duplicate content IS an issue when you are talking about RANKING.

With the new Panda update, Google is enforcing this even more strongly! Having UNIQUE content on each page is crucial.

The best way is to outsource the creation of 5-10 GOOD articles. I include this in my fee, charging £50 for it, then outsource it for $5 per article.
The cheaper option is spinning an article from Ezine, making sure you spin to a uniqueness of 40%+; this will get you off Google's radar.
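For what it's worth, "uniqueness" figures like the 40% above are usually estimated by comparing overlapping word sequences between two texts. Here is a minimal sketch of that general idea (shingle-based Jaccard similarity); the function names, sample texts, and threshold are illustrative assumptions, not any specific tool's formula:

```python
# Rough sketch of how text-similarity scores are commonly estimated:
# compare sets of overlapping word n-grams ("shingles") between two texts.
# Illustration of the general idea only, not Copyscape's actual metric.

def shingles(text, n=3):
    """Return the set of n-word sequences in the text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a, b, n=3):
    """Jaccard similarity of the two texts' shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a, n), shingles(b, n)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

original = "we repair brakes engines and transmissions for all car makes"
spun = "our shop fixes brakes engines and transmissions on every vehicle make"
print(f"similarity: {similarity(original, spun):.2f}")
```

A low similarity score between the spun copy and the source suggests a rewrite is substantial; identical texts score 1.0.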

    Josh
  • Profile picture of the author Vawriters
I think Google has been against dupe content since the beginning. They clearly state that they will rank unique content higher in the SERPs, and Google's ability to do so has increased drastically since the Panda algorithm. So I believe you need to explain this to your client and get the article writing done by someone who knows the topic very well.

Also, apart from unique content, Google now gives importance to design and user engagement metrics. So a clean, user-friendly design has equal importance. I hope this helps.
  • Profile picture of the author JBroyer44
So I am in the process of building city-based landing pages for my client. Each page promotes the same service, just targeting a different town. I am rewriting key paragraphs and sentences but not the meat of the content.

When I check out my competition's sites, they are using the same content on every page and just changing the city name.

I am targeting my client's top 10 cities, so rewriting a services page 10 times is kind of tough, as there is not a lot of meat to it.

I am thinking each page is essentially trying to rank for a different keyword, so Google will not be displaying two pages with the same content for the same search query. So why would it be a problem?
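As a minimal sketch of the landing-page approach being described (the template text and phone numbers are illustrative assumptions; only the city names come from this thread), pages like these are typically generated from one template with city-specific details swapped in:

```python
# Hypothetical sketch: generate per-city landing pages from one template.
# Template wording and phone numbers are made up for illustration.
from string import Template

PAGE = Template(
    "<h1>Car Repair in $city</h1>\n"
    "<p>Serving drivers in $city and nearby areas. "
    "Call $phone for an appointment.</p>"
)

locations = [
    {"city": "Monroe", "phone": "555-0101"},
    {"city": "Snoqualmie", "phone": "555-0102"},
]

for loc in locations:
    # Each location gets its own page with its own details filled in.
    print(PAGE.substitute(loc))
```

The more location-specific fields the template carries (address, hours, staff, local details), the less the resulting pages are word-for-word identical.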
    Signature

    "The force is strong with this one"
    Facebook Ad Services: http://sellabletraffic.com

    • Profile picture of the author MikeFriedman
      Originally Posted by JBroyer44 View Post

I am thinking each page is essentially trying to rank for a different keyword, so Google will not be displaying two pages with the same content for the same search query. So why would it be a problem?
      This is one of the key points everyone is missing in their haste to get their post count up.
      Signature
      Get the TIPS and STRATEGIES I use to HELP businesses GROW through SEO.
      Delivered to you each week!

      >>> Sign Up Now <<<
      • Profile picture of the author TimD
        Quote:
        Originally Posted by JBroyer44
I am thinking each page is essentially trying to rank for a different keyword, so Google will not be displaying two pages with the same content for the same search query. So why would it be a problem?

        This is one of the key points everyone is missing in their haste to get their post count up.

        _________________

Yes, I agree with Mike and J that this is the nut of the issue. Does Google Panda attempt to eliminate duplicate content of any kind, or does it simply want to stop duplicate content showing up on page one for a specific keyword?

In that case, it would be fine for me to have a site, snoqualmiecarrepair.com, that used the same content as monroecarrepair.com. Google simply would not rank both sites on page one for the term "monroe car repair".

        If that's the case, I think I'm ok.
        • Profile picture of the author MarketologyTeam
          I've got a system that will change your DUP content (or any other content) into 100% Unique and 100% copyscape passed content. NO REWRITES, NO SPINNING.

It basically encodes the stopwords, leaving your keywords alone. To the naked eye, the words in the article "appear" unchanged. However, if you run the SAME article back through Copyscape, this seemingly unaltered article will show up as 100% Copyscape Premium passed.

          For the sake of full disclosure, yes I am offering this for sale soon.

          BUT...not now, and I would like to give the OP a few FREE copies for his clients city sites . PM me any info and I'll get them to you fast.
          Signature
          UNLIMITED CONTENT - 100% CopyScape Passed, EZINE Quality Articles. PM me for info.
          • Profile picture of the author TimD
            Wow, that's a very generous offer.

At the same time, I don't want to build a client website based on exploiting a temporary hole in Google's search algorithm. This is a going concern that my client has been operating for years and hopes to continue operating.

            He doesn't have the "whatever works for now" mentality that many internet marketers share. He wants to build lasting assets.

            That said, I'm sincerely grateful for your generosity and hope the software works well for you.
            • Profile picture of the author MarketologyTeam
              Originally Posted by TimD View Post

              Wow, that's a very generous offer.

              At the same time, I don't want to build a client website based on exploiting a temporary hole in Google's search algorithm. This is a going concern for my client that he's been operating for years and hopes to continue operating.

              He doesn't have the "whatever works for now" mentality that many internet marketers share. He wants to build lasting assets.

              That said, I'm sincerely grateful for your generosity and hope the software works well for you.
              Thanks. No worries
              Signature
              UNLIMITED CONTENT - 100% CopyScape Passed, EZINE Quality Articles. PM me for info.
              • Profile picture of the author Bon508
There is much debate about what constitutes Google's "duplicate content penalty," so I went straight to the source, Google:

                "In some cases, content is deliberately duplicated across domains in an attempt to manipulate search engine rankings or win more traffic. Deceptive practices like this can result in a poor user experience, when a visitor sees substantially the same content repeated within a set of search results."
                and

                "... in the rare cases in which Google perceives that duplicate content may be shown with intent to manipulate our rankings and deceive our users, we'll also make appropriate adjustments in the indexing and ranking of the sites involved. As a result, the ranking of the site may suffer, or the site might be removed entirely from the Google index, in which case it will no longer appear in search results."
                On a prospect's site I'm looking at right now, his SEO person has created many IDENTICAL pages with only the name of the city changed -- for about 20 cities. I do understand the purpose of doing that; his site is currently ranking well for various cities -- but as others have said, there shouldn't be repeats of that content coming up in the search results. And it does not appear to "deceive" users. On the other hand, it may seem like its intent is to "manipulate" the rankings.

                So as usual, I'm confused. :confused:

                Should I advise him to have those (many) pages rewritten so they're not duplicate content, or not worry about it?

                Is this a legitimate way to rank one website for several cities?
                • Profile picture of the author TimD
Originally Posted by Bon508 View Post

On a prospect's site I'm looking at right now, his SEO person has created many IDENTICAL pages with only the name of the city changed -- for about 20 cities. I do understand the purpose of doing that; his site is currently ranking well for various cities -- but as others have said, there shouldn't be repeats of that content coming up in the search results. And it does not appear to "deceive" users. On the other hand, it may seem like its intent is to "manipulate" the rankings.

                  So as usual, I'm confused. :confused:

                  Should I advise him to have those (many) pages rewritten so they're not duplicate content, or not worry about it?

                  Is this a legitimate way to rank one website for several cities?
I agree, this is the nut. I used to do pages like those you describe as well. Holly Cotter called them "flypaper pages" because you could take the text, optimize it for a city, and it would rank well for that city.

The problem is not whether the practice is ethical - you and I seem to agree it is - but whether Google's automated algorithm will catch it and take the page out of the index.

For that, we need someone with a technical understanding of, or experience with, the way Google is handling this post-Panda.
                  • Profile picture of the author Silent Warrior
I've never had a problem ranking with duplicate content for local search. All the results I've seen so far prove to me that all you ever need to do is change the city location, and your keyword search term changes with it, which is enough for local search. But if this is something you're really worried about for a client, why not just do as suggested before and outsource new articles? You're only talking about a few sites, and you could load them up with unique content for a few bucks. Just upsell the client on your unique services.
                    • Profile picture of the author TimD
                      Originally Posted by Silent Warrior View Post

                      I've never had a problem ranking with duplicate content for local search. All the results I've seen so far prove to me that all you ever need to do is change the city location and your keyword search term is changed. Which is enough for local search. But if this is something you're really worried about for a client, why not just do as suggested before, and outsource new articles? You're only talking about a few sites and you could load them up with unique content for a few bucks. Just upsell the client on your unique services.
                      I may need to do what you're suggesting. My challenge is that the client cares A LOT about what shows up on his website. I created a lot of content for him. And I would estimate that he pared away 60% or more of each page and left it very bare bones. He feels that's what sells in his market. I can always put some city-specific stats in there but I can guarantee he'll want to delete them.

                      Thanks for the comments.
                      • Profile picture of the author bryson
                        Originally Posted by TimD View Post

                        I may need to do what you're suggesting. My challenge is that the client cares A LOT about what shows up on his website. I created a lot of content for him. And I would estimate that he pared away 60% or more of each page and left it very bare bones. He feels that's what sells in his market. I can always put some city-specific stats in there but I can guarantee he'll want to delete them.

                        Thanks for the comments.
Based on what you have said so far about your client, leave the meat of the content as they want it, change the keywords, and do whatever else you were planning to get them to rank. If it works, great; if not, maybe your client will take your unique-content suggestions next time.
                        Signature
                        Do You Do Local Lead Gen? Hate Cold Calling for New Clients? We Have the Solution.. Get Hot Offline Leads Today... These Leads WANT to hear from YOU!

                        Sign Up Today and Get Clients!Local Lead Gen Leads
  • Profile picture of the author mrmatt
    What I do is take the article and spin it.

    I re-write each sentence 10 times.

Throw it into The Best Spinner to output a unique article.

Then I go back through it and add the keyword or geo keywords throughout.

The rewriting 10 times may be a little overkill.
  • Profile picture of the author seomanifest
I have a similar problem: a large manufacturer is creating sites for all their dealers. Obviously all the product pages are identical; the contact and about pages are different, but that is about it. I can see the same problem for car dealership sites. What do you suggest in such a scenario?
