Demo Web Page - Is Duplicate Content a Problem?

9 replies
  • SEO
So I've got a demo domain for some websites that I'm putting together - each site will go in its own directory, for example...

www.demowebsite.com/site1
www.demowebsite.com/site2 etc.

Once a site is ready to go live I'll delete it from the demo domain and put it live on its own domain.

My question is: will this give me duplicate content issues with Google?

If the answer is yes... can I simply de-index the domain and block all robots with robots.txt to prevent any problems?
#content #demo #duplicate #page #problem #web #website
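For reference, blocking all well-behaved crawlers from the whole demo domain takes a two-line robots.txt at the domain root (the filename and location are fixed by convention):

```text
# www.demowebsite.com/robots.txt
# Block all compliant crawlers from every path on the demo domain
User-agent: *
Disallow: /
```

Note that robots.txt blocks crawling, not indexing - pages that are already in the index can remain in the SERPs even after this is in place.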
  • Profile picture of the author yukon
    Banned
Why not use Lorem Ipsum as the content/text? You said they're demo sites.

    At the very least use unique page titles if you care about ranking the demo page in Google SERPs.

    Most people rank the sales page & link out to the demo from the sales page.
  • Simple Ryan,

    I would use webby's advice and just block the content now. Of course, if it's already indexed there isn't much you can do. Just redirect the demo domain using a server configuration file and you should be fine.

    Best,

    Shawn
    Signature
    Outsource to the experts...

    We customize your Blog, eBook, Press Release and Sale Copy content with your message.

  • Profile picture of the author Simple Ryan
Thanks for the advice - I've gone and set up a robots.txt blocking the sites.

Do you think that it's necessary to de-index the domain with a meta noindex as well?
    • Profile picture of the author yukon
      Banned
      Originally Posted by Simple Ryan View Post

      Thanks for the advice - I've gone and set up a robots.txt blocking the sites.

      Do you think that it's necessary to de-index the domain with a meta noindex as well?
      You can noindex the page/s you don't want in the SERPs, but they'll remain in the SERPs until Googlebot tries to reindex the page/s.
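The noindex yukon describes is a standard robots meta tag placed in the head of each page:

```html
<!-- In the <head> of every demo page you want dropped from the SERPs -->
<meta name="robots" content="noindex">
```

Crawlers have to be able to fetch the page to see this tag, so it only takes effect on pages they are allowed to crawl.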
      • Profile picture of the author Simple Ryan
        OK - I can see that this has the potential to cause a catch-22 situation?

        If I noindex the pages and they remain until the Googlebot goes to reindex them... and the Googlebot is blocked by robots.txt so it can't reindex them... then they'll remain indexed???
        • Profile picture of the author yukon
          Banned
          Originally Posted by Simple Ryan View Post

          OK - I can see that this has the potential to cause a catch-22 situation?

          If I noindex the pages and they remain until the Googlebot goes to reindex them... and the Googlebot is blocked by robots.txt so it can't reindex them... then they'll remain indexed???
          Hold off on the robots.txt for now. Noindex whatever pages you don't want in the SERPs, then try & get Googlebot back on your pages/URLs to reindex the now-noindexed pages.

          Keep an eye on the SERP-indexed pages you're trying to noindex. When you see the pages are removed from the SERPs, you can go back & add the robots.txt as a backup to keep Googlebot from indexing the pages.

          The noindex tag should take care of removing the pages from the SERPs.
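If editing every page to add the meta tag is a pain, one alternative (assuming Apache with mod_headers enabled - this is not mentioned in the thread, just a standard equivalent) is to send noindex as an HTTP header for the whole demo domain; search engines treat X-Robots-Tag the same as the meta tag:

```apacheconf
# Send noindex on every response from the demo domain
# (Apache .htaccess or vhost config; requires mod_headers)
Header set X-Robots-Tag "noindex"
```

The same caveat applies: crawlers must be able to fetch the pages to see the header, so add the robots.txt block only after the pages drop out of the SERPs.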
          • Profile picture of the author Simple Ryan
            Thanks Yukon - sounds like a good method!

            First I'll check whether the pages have actually been indexed or not (they may not have yet).

            I'm wondering whether I can use the Webmaster Tools 'Remove URL' tool instead of 'noindex', because it might work without bots needing to crawl the site?
            • Profile picture of the author yukon
              Banned
              Originally Posted by Simple Ryan View Post

              Thanks Yukon - sounds like a good method!

              First I'll check whether the pages have actually been indexed or not (they may not have yet).

              I'm wondering whether I can use the Webmaster Tools 'Remove URL' tool instead of 'noindex', because it might work without bots needing to crawl the site?
              You can use WMT, but you still need to noindex the pages you don't want showing up in the SERPs. Google has different types of bots indexing pages, plus there are other search engines that will eventually find the noindex (e.g. Yahoo, Bing).

              The first thing I would do is add the noindex tag to the problem page/s, then try the WMT link removal to maybe help get Googlebot back on the web pages faster.
  • Profile picture of the author Simple Ryan
    Thanks Yukon - great advice!