Question regarding duplicate content

3 replies | Web Design
I recently made a site on a .net domain because the .com was taken. It's still under development, and it's taking the web developer forever to finish, so I've decided to start a .org one. Here's the thing:

- I've uploaded some content for the articles and sales letter on the .net domain. I did a Google search and they have been indexed. Is there a way to work around this so I can move my content to a different domain name without being penalized for duplicate content?
#content #duplicate #question
  • duanecilliers
    Hey Caliant.

    I have read a hell of a lot about duplicate content and put it through severe testing. This is a brief summary of my findings.

    Duplicate content is only an issue when you have the same content at different locations on the same site (this is where canonical URLs come in handy), not when you have the same content on different sites. How do you think social bookmarking and news sites work without being penalized?

    Read Matt Cutts' blog (can't post links yet, sorry). He has quite a lot of info on the matter.

    People tend to pass on rumors whether they are sure of their facts or not. It's best to make sure of your facts before spreading the word, people.
  • caliant
    Thanks for the replies. I found this on Google's policy page:

    Google no longer recommends blocking crawler access to duplicate content on your website, whether with a robots.txt file or other methods. If search engines can't crawl pages with duplicate content, they can't automatically detect that these URLs point to the same content and will therefore effectively have to treat them as separate, unique pages. A better solution is to allow search engines to crawl these URLs, but mark them as duplicates by using the rel="canonical" link element, the URL parameter handling tool, or 301 redirects. In cases where duplicate content leads to us crawling too much of your website, you can also adjust the crawl rate setting in Webmaster Tools.
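    For anyone unfamiliar with the rel="canonical" link element that policy mentions, it's just a tag in the page's <head> that points search engines at the preferred URL. A minimal sketch (the domain and filename here are placeholders, not from this thread):

    ```html
    <!-- Placed in the <head> of the duplicate page.
         example.org and sales-letter.html are hypothetical placeholders;
         the href should point at the one URL you want indexed. -->
    <link rel="canonical" href="https://www.example.org/sales-letter.html" />
    ```

    With that in place, crawlers can still fetch the duplicate page but are told to consolidate ranking signals onto the canonical URL.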

    I plan to test out multiple squeeze pages with mostly identical content.

    Now, is it still a good plan to deindex my squeeze pages and sales letter? I'm a newbie, I plan to promote them using AdWords, and I'm aiming for the highest Quality Score possible. After reading the Google policy above, what are your thoughts on the best solution to avoid a duplicate content penalty? *Other than not having duplicate content in the first place.*
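    For what it's worth, if you do decide to deindex the squeeze pages, the usual way (per Google's own guidance, rather than blocking them in robots.txt) is a robots meta tag in each page's <head>:

    ```html
    <!-- Tells search engines not to index this page; the page itself
         stays crawlable so the directive can actually be seen. -->
    <meta name="robots" content="noindex" />
    ```

    Note this doesn't affect AdWords, which fetches landing pages with its own system rather than the organic search index.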
