Can You Tell SEs To Temporarily Not Spider Your Site?

4 replies • SEO
I am planning some site maintenance this weekend. In the last week, our traffic has really taken off, and I would hate to kill the momentum by having spiders see 404s or incorrect pages before the maintenance is complete.

Is there some way to tell them not to spider for a given amount of time? :confused:
#site #spider #temporarily
  • Oneal Degrassi
    I found this in Google code...

    You can temporarily suspend all crawling by returning an HTTP result code of 503 for all URLs, including the robots.txt file. The robots.txt file will be retried periodically until it can be accessed again. We do not recommend changing your robots.txt file to disallow crawling.
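
    For anyone curious what that looks like in practice, here is a minimal sketch in Python of a bare WSGI app that answers every request, robots.txt included, with a 503 and a Retry-After header. This is only an illustration of Google's advice, not their example; the port and the message are placeholders.

        # Serve 503 for every URL (including /robots.txt) during maintenance.
        # The Retry-After header hints when crawlers should come back.
        from wsgiref.simple_server import make_server

        def maintenance_app(environ, start_response):
            body = b"Down for maintenance; back soon."
            start_response("503 Service Unavailable", [
                ("Content-Type", "text/plain"),
                ("Retry-After", "7200"),  # seconds; placeholder value
                ("Content-Length", str(len(body))),
            ])
            return [body]

        if __name__ == "__main__":
            make_server("", 8000, maintenance_app).serve_forever()

    Because the status is 503 rather than 404, search engines treat the outage as temporary instead of dropping the pages from the index.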

    So now I ask: would it be better to simply let them see an incorrect or missing page, as long as it is there the next time they spider?

    I don't want to use the noindex tag either.

    Thanks.
  • ganesh
    Are you using a CMS for your site? Most CMSs let you temporarily close down your site, and you can even personalize the message visitors see while you work on the backend.
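
    For sites whose CMS lacks that feature, the same idea is easy to approximate by hand. Here is a minimal sketch assuming a Flask app, which the thread does not name; the maintenance.flag filename is hypothetical. Visitors see the personalized message, and the 503 status tells search engines the closure is temporary.

        # Toggle maintenance mode by creating/removing a flag file.
        import os
        from flask import Flask

        app = Flask(__name__)
        MAINTENANCE_FLAG = "maintenance.flag"  # hypothetical toggle file

        @app.before_request
        def check_maintenance():
            # Returning a value here short-circuits normal routing in Flask.
            if os.path.exists(MAINTENANCE_FLAG):
                return ("We are sprucing things up; back this weekend!",
                        503, {"Retry-After": "7200"})

        @app.route("/")
        def index():
            return "Normal site content"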
  • calfred
    If you are using XSitePro, you can untick the box that says "allow visits from search engine robots."
  • Oneal Degrassi
    No, my CMS does not handle that.

    I thought I had seen a Google post about 6 months ago talking about this, but I can't find it.
