Disallowing a subdomain in robots.txt

by halmo
2 replies
Would someone please tell me how to disallow a sub-domain, and also a specific directory under a sub-domain, in a robots.txt file?

e.g. I want to disallow sub.example.com
and also want to disallow eso.example.com/directory1

I know how to disallow a regular directory, but don't know how to do that with sub-domains.

Thanks for any help!

Edit: Nobody has replied yet, and I just found the answer, so I thought I would post it here in case other people want to know. The answer is that you need a separate robots.txt for each subdomain, served from that subdomain's root, because search engines treat each subdomain as a separate site. Hope this helps someone else too.
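For anyone who finds this thread later, here is a minimal sketch of what the two files might look like, assuming my example hostnames (sub.example.com and eso.example.com) and that each file is uploaded to the root of its own subdomain:

# robots.txt served at sub.example.com/robots.txt
# asks all compliant crawlers to stay off the whole subdomain
User-agent: *
Disallow: /

# robots.txt served at eso.example.com/robots.txt
# asks crawlers to skip /directory1/ only; the rest of the subdomain stays crawlable
User-agent: *
Disallow: /directory1/

Keep in mind that robots.txt is only a request to well-behaved crawlers, not an access control.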
#disallowing #robotstxt #subdomain
  • mojojuju
    Originally Posted by halmo:
    Edit: Nobody replied yet, and I just found the answer, so I thought I would post it here in case other people might want to know. The answer is that you need to include a separate robots.txt for each subdomain because Google views subdomains as separate sites. Hope this helps out someone else too.
    Hey, it's very cool of you to post the solution you found.
    Signature

    :)

  • kenetix
    Yup. Or you could use .htaccess to block access at the server level; that would keep robots from reaching (and so indexing) your directory. A rough sketch follows below.
    Signature

    Free advertising for your services - beam.biz

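    To sketch what I mean: this assumes Apache with mod_rewrite enabled, an .htaccess file sitting in the subdomain's document root, and the bot names are just examples.

    # .htaccess in the subdomain's document root (Apache + mod_rewrite assumed)
    RewriteEngine On
    # Return 403 Forbidden to any request whose User-Agent matches these crawler names
    RewriteCond %{HTTP_USER_AGENT} (Googlebot|Bingbot|Slurp) [NC]
    RewriteRule ^ - [F,L]

    Unlike robots.txt, this is enforced by the server, so matching crawlers can't fetch the pages at all.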
