Can You Tell SEs To Temporarily Not Spider Your Site?

I am planning some site maintenance this weekend. In the last week, our traffic has really taken off, and I would hate to kill the momentum by having the spiders see 404s or incorrect pages before the maintenance is complete.

Is there some sort of way to tell them not to spider just for a given amount of time? :confused:
#search engine optimization #site #spider #temporarily
  • I found this in Google code...

    You can temporarily suspend all crawling by returning an HTTP result code of 503 for all URLs, including the robots.txt file. The robots.txt file will be retried periodically until it can be accessed again. We do not recommend changing your robots.txt file to disallow crawling.

    So now I ask: would it be better to simply let them see an incorrect or missing page, as long as the correct page is there the next time they spider? (See the sketch of the 503 approach after the replies below.)

    I don't want to use the noindex tag either.

    Thanks.
  • Are you using a CMS for your site? Most CMSes let you temporarily close the site and show a maintenance page. You can even personalize the message that appears while you work on the backend.
  • If you are using XSitePro, you can untick the box that says "allow visits from search engine robots".
  • No, my CMS does not handle that.

    I thought I had seen a Google post about 6 months ago talking about this, but I can't find it.
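
For later readers, here is roughly what the 503 approach quoted in the first reply can look like in practice. This is a minimal sketch, not tied to the poster's actual stack: a throwaway Python stand-in server that answers every URL, robots.txt included, with 503 Service Unavailable plus a Retry-After hint, so crawlers keep retrying instead of dropping pages. The port (8080) and the two-hour retry window are illustrative assumptions.

    # Minimal sketch: answer every request with 503 + Retry-After while
    # maintenance is underway. Port and window are assumptions, not from
    # the thread.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    RETRY_AFTER_SECONDS = 7200  # assumed ~2-hour maintenance window

    class MaintenanceHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # 503 signals "temporarily unavailable, retry later", so
            # crawlers come back rather than treating pages as gone,
            # as they might with 404s or a robots.txt Disallow.
            body = b"Down for maintenance. Please check back soon.\n"
            self.send_response(503)
            self.send_header("Retry-After", str(RETRY_AFTER_SECONDS))
            self.send_header("Content-Type", "text/plain")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)  # /robots.txt is just another GET path

        def do_HEAD(self):
            # Same status and headers for HEAD probes, without a body.
            self.send_response(503)
            self.send_header("Retry-After", str(RETRY_AFTER_SECONDS))
            self.end_headers()

    if __name__ == "__main__":
        HTTPServer(("", 8080), MaintenanceHandler).serve_forever()

Once the maintenance is done, the normal site simply comes back and crawling resumes on the spiders' next retry; in a real setup the same status and header would more likely be sent from the existing web server rather than from a separate process.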
