8 replies
  • SEO
I've just moved to a shared hosting account, and after setting up several domains as addon domains I got to wondering: will Google index all of these addon sub-directories from the main domain?

Am I making this too complicated, or can one robots.txt uploaded to the main domain block all the addon domains from being listed by Google, simply because I blocked them in that main domain's robots.txt to keep them from showing up as subdirectory.maindomain.com?

main domain >> 123.com
addon domain >> abc.com >> file storage structure = abc.123.com or 123.com/abc

Now, because I don't want Google to index every addon domain under these addon URLs like abc.123.com, I've put up a robots.txt only on the main domain 123.com saying Disallow: /abc/. My question is: by doing this, am I inadvertently telling Google not to index abc.com as well?
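
In other words, something like this at 123.com/robots.txt (using the stand-in names above; the wildcard user-agent line is my guess at the usual form):

User-agent: *
Disallow: /abc/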

:confused:
#concern #robotstxt
  • Witchie
    ... was I too confusing?
  • HunterSnake
    Hit me up on AIM/ICQ and I'll try to help you out. If you're set up the way I suspect, this is easy.
    • Lance K
      Good question. I've wondered the same thing.
      Signature
      "You can have everything in life you want if you will just help enough other people get what they want."
      ~ Zig Ziglar
  • ptwain
    Witchie,

    You are correct. You have to create a robots.txt file for the main domain and add the Disallow command for all the addon sites on that shared hosting.

    Then you will place a separate robots.txt file into each addon domain folder that you have on that hosting.
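
    For example, here's a rough sketch assuming a typical cPanel layout, where the main site lives in public_html and the addon folder is public_html/abc (the folder names are just illustrations):

    # public_html/robots.txt -- answers requests for 123.com/robots.txt
    User-agent: *
    Disallow: /abc/

    # public_html/abc/robots.txt -- answers requests for the addon domain's robots.txt
    User-agent: Googlebot
    Disallow: /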
    • Brad Callen
      Originally Posted by ptwain

      Witchie,

      You are correct. You have to create a robots.txt file for the main domain and add the Disallow command for all the addon sites on that shared hosting.

      Then you will place a separate robots.txt file into each addon domain folder that you have on that hosting.
      Correct, just add the following:

      User-agent: Googlebot
      Disallow: /

      to the robots.txt in each subdomain folder you want left out.
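
      And if you'd rather keep all crawlers out, not just Google, the wildcard version of the same file would be:

      User-agent: *
      Disallow: /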
      Signature
      iWriter.com - The Original Content Creation Service. Now with over 350,000 active writers. Let us write or re-write your articles, eBooks, blog posts and more... for as little as $1.25! 3,711,814 articles written to date!
      • Lance K
        Originally Posted by Brad Callen

        Correct, just add the following:

        User-agent: Googlebot
        Disallow: /

        to the robots.txt in each subdomain folder you want left out.
        And that won't affect the indexing of abc.com, even though it's hosted as an addon domain (which the host treats as a subdomain) at 123.com?
        Signature
        "You can have everything in life you want if you will just help enough other people get what they want."
        ~ Zig Ziglar
  • gracioustech
    Hmm, you are right, that code would also disallow your domain. I don't know why you want to disallow robots from your sub-domain.

    It is good news that your sub-domains are also indexed by Google.
  • HunterSnake
    I helped him resolve his issues over IM a while ago.
